UK GDPR

UK GDPR - Data Protection After The Transition Period

UK GDPR, EU GDPR, DPA 2018, DP Regulations. Confused? Hopefully this blog will help you understand what is happening with data protection laws in the UK now that the Brexit transition period has ended.

The UK data protection authority, the Information Commissioner’s Office (ICO), is telling us that at the end of the Brexit transition period, data protection compliance should continue as usual. The key principles, rights and obligations remain the same. What then is the consequence of the Brexit transition period ending on data protection in the UK?

Most importantly, following the end of the transition period, the EU and the UK will be operating under different, albeit very similar, data protection regimes. This means that any transfer of data between the two will be treated as a transfer between two independent data protection legal systems.

The Legislation – UK GDPR and the DP Brexit Regulations

A confusing aspect of the UK’s new data protection regime is the terminology used to refer to its legislation. There is mention of the ‘UK GDPR’ and the ‘DP Brexit Regulations’. To clear up any misunderstanding, it is useful to consider how data protection legislation operated before Brexit.

Before Brexit, data protection was mainly governed by two pieces of legislation: the General Data Protection Regulation ((EU) 2016/679) and the Data Protection Act 2018, the first being directly applicable EU law and the second being the UK legislation that supplemented and tailored its application in the UK.

Brexit raised the question of what the UK government should do with the body of EU law applying in the UK. The European Union (Withdrawal) Act 2018 sought to retain EU law already operating in the UK, including the GDPR. Simply put, retained EU law is copied into UK law and amended so that it works in a domestic context. The GDPR, in its retained form, is now known as the UK GDPR, in contrast to the data protection law of the EU, now known (in the UK) as the EU GDPR.

The Data Protection Act 2018 (DPA), although already UK law, was also treated as retained EU law for the purposes of the European Union (Withdrawal) Act 2018, and it therefore continues to be a main source of data protection law in the UK after the end of the transition period.

In order for the retained EU data protection law to work in the UK after the transition period, it needs to be amended. The Data Protection, Privacy and Electronic Communications (Amendments etc) (EU Exit) Regulations 2019 (referred to here as the DP Brexit Regulations) is the legislation by which this is achieved. The UK GDPR and the DPA, as amended by the DP Brexit Regulations, together form the core of the UK’s data protection law. Organisations will therefore need to consider two legal texts after the transition period: the UK GDPR and the DPA.

Changes made by the DP Brexit Regulations

The purpose of the DP Brexit Regulations is first and foremost to integrate EU data protection law, as it stands, into UK law after the transition period. Therefore most of the changes are relatively predictable. Here are a few:

  • The Information Commissioner (the UK data protection authority) is no longer party to the EU GDPR co-operation and consistency mechanisms and no longer has a seat on the European Data Protection Board (EDPB).
  • Amendments are made throughout the UK GDPR to change references to EU institutions, member states and decisions.
  • European Commission powers are transferred to the Information Commissioner or the Secretary of State. For example, the Information Commissioner has the power to issue Standard Contractual Clauses (a mechanism by which data is transferred internationally).
  • Section 4 of the DPA is amended to make clear that it applies to personal data to which the UK GDPR applies and that it supplements and must be read with the UK GDPR.
  • A new section 17A in the DPA covers transfers based on adequacy regulations (a mechanism by which data is transferred internationally). The Secretary of State has the power to make “adequacy regulations” to specify that a third country, territory, sector or international organisation ensures an adequate level of data protection.

International transfers

At the end of the transition period, the UK became a third country under the EU GDPR, meaning that EU controllers and processors would need to ensure that an appropriate transfer mechanism was in place to protect transfers, e.g. Standard Contractual Clauses or Binding Corporate Rules.

However, on 24 December 2020, the UK and EU reached a trade and co-operation agreement addressing the arrangements following the end of the Brexit transition period on 31 December 2020 (as implemented by the European Union (Future Relationship) Act 2020).

Most significantly, the agreement introduced a bridging period of at least four months (extendable by a further two months unless one of the parties objects) during which data can flow between the regimes without additional safeguards. The aim is to give organisations breathing space while the Commission continues its adequacy assessment of the UK. If the UK is granted an adequacy decision, data will continue to flow freely between the regimes after this period.

Data processed or obtained before the end of the transition period

From the end of the transition period the UK is required to continue applying “EU law on the protection of personal data” to the processing of EU personal data where the personal data was processed before the end of the transition period. It will therefore be helpful for organisations to know what data has been processed in the EU before the end of the transition period so that, should the regimes diverge, that data continues to have EU law applied to it. By contrast, personal data about UK data subjects processed in the UK before the end of the transition period will fall under the UK GDPR and DPA.

More to come - UK GDPR and EU GDPR to diverge?

The next big development in data protection and Brexit will be whether or not the Commission grants the UK an adequacy decision. Organisations should have a clear idea of how they will confront the possibility that no adequacy decision is reached. This will mean reviewing data flows and the contracts that enable them.

The ICO is right to say that the data protection principles before Brexit will largely remain the same in the UK. The UK GDPR and DPA as a new legislative framework are, more than anything else, a replica of what has come before. But with an adequacy decision pending, and with the EU’s draft E-Privacy Regulation still being finalised and therefore unlikely ever to apply in the UK, the two data protection regimes could split in significant ways in a relatively short amount of time.

If you have any questions on Brexit and data protection, data protection law more generally or on any of the issues raised in this article please get in touch with one of our data protection lawyers.


Statement Of Objections To Amazon

EC Sends Statement Of Objections To Amazon - Big Data Law

On 10 November 2020, the European Commission announced that it had sent a statement of objections to Amazon as part of its investigation into whether Amazon's use of sensitive data from independent retailers who sell on its marketplace is in breach of Article 102 of the Treaty on the Functioning of the European Union (TFEU). The Commission has also opened a formal investigation into Amazon's allegedly discriminatory business practices.

What data is Amazon collecting?

Amazon has a dual role as a platform:

  • It provides a marketplace where independent sellers can sell products directly to consumers. Amazon is the most important or dominant marketplace in many European countries.
  • It sells products as a retailer on the same marketplace, in competition with those sellers.

As a marketplace service provider, Amazon has access to non-public business data of third party sellers. This data relates to matters such as the number of ordered and shipped units of products, the sellers' revenues on the marketplace, the number of visits to sellers' offers, data relating to shipping and to sellers' past performance, and consumers' claims on products, including activated guarantees.

Investigation into use of independent sellers’ data

In July 2019, the Commission announced that it had opened a formal investigation to examine whether Amazon's use of competitively sensitive information about marketplace sellers, their products and transactions on the Amazon marketplace constitutes anti-competitive agreements or practices in breach of Article 101 of the Treaty on the Functioning of the European Union (TFEU) and/or an abuse of a dominant position in breach of Article 102 of the TFEU.

Statement of objections to Amazon

The Commission has now sent a statement of objections to Amazon alleging that Amazon has breached Article 102 of the TFEU by abusing its dominant position as a marketplace service provider in Germany and France. Having analysed a data sample covering over 80 million transactions and around 100 million product listings on Amazon's European marketplaces, the Commission is alleging in its statement of objections to Amazon that:

  • Very large quantities of non-public seller data are available to employees of Amazon's retail business and feed into automated systems. Granular, real-time business data relating to third party sellers' listings and transactions on the Amazon platform is systematically fed into the algorithms of Amazon's retail business, which aggregate the data and use it to calibrate Amazon's retail offers and strategic business decisions (such as which new products to launch, the price of each individual offer, the management of inventories, and the choice of the best supplier for a product).
  • This acts to the detriment of other marketplace sellers as, for example, Amazon can use this data to focus its offers on the best-selling products across product categories and to adjust its offers in light of the non-public data of competing sellers.
  • The use of non-public marketplace seller data, therefore, allows Amazon to avoid the normal risks of retail competition and to leverage its dominance in the market for the provision of marketplace services in France and Germany, which are the biggest markets for Amazon in the EU.

The Commission's concerns are not only about the insights Amazon Retail has into the sensitive business data of one particular seller, but rather about the insights that Amazon Retail has about the accumulated business data of more than 800,000 active sellers in the EU, covering more than a billion different products. Amazon is able to aggregate and combine individual seller data in real time, and to draw precise, targeted conclusions from these data.

The Commission has, therefore, come to the preliminary conclusion that the use of these data allows Amazon to focus on the sale of the best-selling products. This marginalises third party sellers and limits their ability to grow. Amazon now has the opportunity to examine the documents in the Commission's investigation file, reply in writing to the allegations in the statement of objections and request an oral hearing to present its comments on the case.

Investigation into Amazon practices regarding the “Buy Box” and Prime label

The Commission states that, as a result of looking into Amazon's use of data, it identified concerns that Amazon's business practices might artificially favour its own retail offers and offers of marketplace sellers that use Amazon's logistics and delivery services. It has, therefore, now formally initiated proceedings in a separate investigation to examine whether these business practices breach Article 102 of the TFEU.

Problems with digital platforms

In announcing these developments, EU Commission Vice-President Vestager commented that:

“We must ensure that dual role platforms with market power, such as Amazon, do not distort competition. Data on the activity of third party sellers should not be used to the benefit of Amazon when it acts as a competitor to these sellers. The conditions of competition on the Amazon platform must also be fair. Its rules should not artificially favour Amazon's own retail offers or advantage the offers of retailers using Amazon's logistics and delivery services. With e-commerce booming, and Amazon being the leading e-commerce platform, a fair and undistorted access to consumers online is important for all sellers.”

The report prepared for the Commission by three special advisers on "Competition Policy for the digital era" highlighted possible competition issues in relation to digital platforms. As part of the Digital Services Act package, the Commission is now considering the introduction of ex ante regulation for "gatekeeper" platforms, and consulted on issues related to this in June 2020.

Big data regulation

It remains to be seen how these EC investigations will play out and whether the same principles can be applied to smaller online platforms. UK regulators also appear to be ramping up their interest in the overlap between competition law and digital business. Chief Executive of the UK Competition and Markets Authority (CMA), Andrea Coscelli, noted last month that the CMA is increasingly focused on “scrutinising how digital businesses use algorithms and how this could negatively impact competition and consumers” and “will be considering how requirements for auditability and explainability of algorithms might work in practice”.

If you have any questions on the EC’s statement of objections to Amazon, data protection law or on any of the issues raised in this article please get in touch with one of our data protection lawyers.


ICO guidance on AI

ICO Guidance On AI Published - AI And Data Protection

On 30 July 2020, the Information Commissioner’s Office (ICO) published its long-awaited guidance on artificial intelligence (AI) and data protection (ICO guidance on AI), which forms part of its AI auditing framework. However, recognising that AI is still in its early stages and is developing rapidly, the ICO describes the guidance as foundational guidance. The ICO acknowledges that it will need to continue to offer new tools to promote privacy by design in AI and to continue to update the guidance to ensure that it remains relevant.

The need for ICO guidance on AI

Whether it is helping to tackle the coronavirus disease (COVID-19) or managing loan applications, the potential benefits of AI are clear. However, it has long been recognised that it can be difficult to balance the tensions that exist between some of the key characteristics of AI and data protection compliance, particularly under the General Data Protection Regulation ((EU) 2016/679) (GDPR).

The Information Commissioner Elizabeth Denham’s foreword to the ICO guidance on AI confirms that the underlying data protection questions for even the most complex AI project are much the same as with any new project: is data being used fairly, lawfully and transparently? Do people understand how their data is being used, and is it being kept secure?

That said, there is a recognition that AI presents particular challenges when answering these questions and that some aspects of the law require greater thought. Compliance with the data protection principles around data minimisation, for example, can seem particularly challenging given that many AI systems allow machine learning to decide what information is necessary to extract from large data sets.

Scope of the ICO guidance on AI

The guidance forms part of the ICO’s wider AI auditing framework, which also includes auditing tools and procedures for the ICO to use in its audits and investigations and a soon-to-be-released toolkit that is designed to provide further practical support for organisations auditing their own AI use.

It contains recommendations on good practice for organisational and technical measures to mitigate AI risks, whether an organisation is designing its own AI system or procuring one from a third party. It is aimed at those within an organisation who have a compliance focus, such as data protection officers, the legal department, risk managers and senior management, as well as technology specialists, developers and IT risk managers. The ICO’s own auditors will also use it to inform their statutory audit functions.

It is not, however, a statutory code and there is no penalty for failing to adopt the good practice recommendations if an alternative route can be found to comply with the law. It also does not provide ethical or design principles; rather, it corresponds to the data protection principles set out in the GDPR.

Structure of the guidance

The ICO guidance on AI is set out in four parts:

Part 1. This focuses on the AI-specific implications of accountability; namely, responsibility for complying with data protection laws and demonstrating that compliance. The guidance confirms that senior management cannot simply delegate issues to data scientists or engineers, and are responsible for understanding and addressing AI risks. It considers data protection impact assessments (which will be required in the majority of AI use cases involving personal data), setting a meaningful risk appetite, the controller and processor responsibilities, and striking the required balance between the right to data protection and other fundamental rights.

Part 2. This covers lawfulness, fairness and transparency in AI systems, although transparency is addressed in more detail in the ICO’s recent guidance on explaining decisions made with AI (2020 guidance). This section looks at selecting a lawful basis for the different types of processing (for example, consent or performance of a contract), automated decision making, statistical accuracy and how to mitigate potential discrimination to ensure fair processing.

Part 3. This section covers security and data minimisation, and examines the new risks and challenges raised by AI in these areas. For example, AI can increase the potential for loss or misuse of large amounts of personal data that are often required to train AI systems or can introduce software vulnerabilities through new AI-related code. The key message is that organisations should review their risk management practices to ensure that personal data are secure in an AI context.

Part 4. This covers compliance with individual rights, including how individual rights apply to different stages of the AI lifecycle. It also looks at rights relating to solely automated decisions and how to ensure meaningful input or, in the case of solely automated decisions, meaningful review, by humans.

ICO guidance on AI - headline takeaway

According to the Information Commissioner, the headline takeaway from the ICO guidance on AI is that data protection must be considered at an early stage. Mitigation of risk must come at the AI design stage as retrofitting compliance rarely leads to comfortable compliance or practical products.

The guidance also acknowledges that, while it is designed to be integrated into an organisation’s existing risk management processes, AI adoption may require organisations to reassess their governance and risk management practices.

A landscape of guidance

AI is one of the ICO’s top three strategic priorities, and it has been working hard over the last few years to both increase its knowledge and auditing capabilities in this area, as well as to produce practical guidance for organisations.

To develop the guidance, the ICO enlisted technical expertise in the form of Doctor (now Professor) Reuben Binns, who joined the ICO as part of a fellowship scheme. It produced a series of informal consultation blogs in 2019 that were focused on eight AI-specific risk areas. This was followed by a formal consultation draft published in February 2020, the structure of which the guidance largely follows. Despite all this preparatory work, the guidance is still described as foundational.

From a user perspective, practical guidance is good news and the guidance is clear and easy to follow. Multiple layers of guidance can, however, become more difficult to manage. The ICO has already stated that the guidance has been developed to complement its existing resources, including its original Big Data, AI and Machine Learning report (last updated in 2017), and its more recent 2020 guidance.

In addition, there are publications and guidelines from bodies such as the Centre for Data Ethics and the European Commission, and sector-specific regulators such as the Financial Conduct Authority are also working on AI projects. As a result, organisations will need to start considering how to consolidate the different guidance, checklists and principles into their compliance processes.

Opportunities and risks

“The innovation, opportunities and potential value to society of AI will not need emphasising to anyone reading this guidance. Nor is there a need to underline the range of risks involved in the use of technologies that shift processing of personal data to complex computer systems with often opaque approaches and algorithms.” (Opening statement of ICO guidance on AI and data protection.)

If you have any questions on data protection law or on any of the issues raised in the ICO guidance on AI please get in touch with one of our data protection lawyers.


Schrems II

Schrems II - EDPB publishes FAQs on judgment

Following the judgment in Schrems II (Data Protection Commissioner v Facebook Ireland and Maximillian Schrems), the European Data Protection Board (EDPB) has adopted a set of frequently asked questions and responses (FAQs) concerning the judgment. For more information about the decision itself, read our blog.

The Schrems II judgement

The European Court of Justice (ECJ) has invalidated the EU Commission’s decision approving the EU-U.S. Privacy Shield because U.S. intelligence agencies can access personal data relating to EU residents in ways that are incompatible with EU personal data protection laws and EU residents lack proper enforcement rights.

In addition, the ECJ ruled that the controller-processor Standard Contractual Clauses (SCCs), another widely used mechanism for international data transfers, remain valid. However, data exporters and importers must assess, prior to any transfer, the laws of the third country to which data is transferred to determine if those laws ensure an adequate level of protection of personal data.

Moving forward

The judgment was welcomed by the EDPB because it highlights the fundamental right to privacy in the context of the transfer of personal data to third countries. In response to the ECJ's ruling that the adequacy decision for the EU-US Privacy Shield is invalid, the EDPB invited the EU and US to work together and establish a complete and effective framework that guarantees the level of protection granted to personal data in the US is essentially equivalent to that guaranteed within the EU.

Schrems II: EDPB FAQs

Although the ECJ also determined in Schrems II that controller to processor standard contractual clauses (SCCs) remain valid as an adequate safeguard for data transfers, the EDPB commented that:

  • No grace period - the ECJ ruling applies with immediate effect. There will be no grace period during which organisations can remedy their Privacy Shield-based data transfers. In contrast, when the US-EU Safe Harbor framework was invalidated in 2015, the Article 29 Working Party granted a grace period until an appropriate solution was found with the U.S. authorities. It did so via a statement dated 16 October 2015, stating no enforcement action would be taken until the end of January 2016. However, while there will be no EU-wide grace period, national supervisory authorities will still have discretion over when to take enforcement actions in their territory.
  • The exporter and importer of the data being transferred must look beyond the protection provided by the terms of the SCCs and assess whether the country to which the data is being transferred offers adequate protection, in the context of the non-exhaustive elements set out in Article 45(2) of the GDPR. If it is determined that the country of destination does not provide an essentially equivalent level of protection to the GDPR, the exporter may have to consider adopting further protective measures in addition to using SCCs. The EDPB is considering what those additional measures could include and will report in due course.
  • The judgment highlights the importance of complying with the obligations included in the terms of the SCCs. If those contractual obligations are not or cannot be complied with, the exporter is bound by the SCCs to suspend the transfer or terminate the SCCs, or to notify its competent supervisory authority if it intends to continue transferring data.
  • Supervisory authorities (SAs) have a responsibility to suspend or prohibit a transfer of data to a third country pursuant to SCCs if those clauses are not or cannot be complied with in that third country, and the protection of the data transferred cannot be ensured by other means.
  • Implication for other transfer mechanisms including BCRs. The threshold set by the ECJ applies to all appropriate transfer mechanisms under Article 46 GDPR. U.S. law referred to by the ECJ (i.e., the Foreign Intelligence Surveillance Act and the Executive Order 12333) applies to any transfer to the U.S. via electronic means, regardless of the transfer mechanism used for such transfer. In particular, the ECJ’s judgment applies in the context of binding corporate rules (BCRs), since U.S. law will also prevail over this cross-border data transfer mechanism. Similar to the SCCs, transfers taking place based on BCRs should be assessed and appropriate supplementary measures should be taken. The EDPB states that it will further assess the consequences of the judgment on transfer mechanisms other than SCCs and BCRs (e.g., approved codes of conduct or certification mechanisms).
  • Companies can rely on the derogations set forth under Article 49 of the GDPR, provided that the conditions as interpreted by the EDPB in its guidance on Article 49 of the GDPR are met. When transferring personal data based on individuals’ consent, such consent should be explicit, specific to the particular data transfer(s) and informed, particularly regarding the risks of the transfer(s). In addition, transfers of personal data that are necessary for the performance of a contract should only take place occasionally. Further, in relation to transfers necessary for important reasons of public interest, the EDPB emphasises the need for an important public interest, as opposed to focusing only on the nature of the transferring organisation. According to the EDPB, transfers based on the public interest derogation cannot become the rule and must be limited to specific situations and to a strict necessity test.

Schrems II: Further clarification expected

The EDPB is still assessing the judgment and will provide further clarification for stakeholders and guidance on transfers of personal data to third countries pursuant to the Schrems II judgment. Data exporters and importers should closely monitor upcoming developments and guidance from the EDPB and national supervisory authorities, assess their existing cross-border transfers and consider implementing supplementary legal, technical or organisational measures in order to ensure they can continue to transfer personal data to third countries lawfully. While the judgment most obviously applies to data transfers to the US, it also has wider implications for transfers to any country outside the EU (third countries).

If you have any questions on Schrems II or data protection law more generally please get in touch with one of our data protection lawyers.


EU-US Privacy Shield

EU-US Privacy Shield invalid: Schrems II

In Data Protection Commissioner v Facebook Ireland and Maximillian Schrems (Case C-311/18) EU:C:2020:559, the European Court of Justice (ECJ) has given its preliminary ruling that Commission Decision 2010/87 on controller to processor standard contractual clauses (SCC) is valid but that Decision 2016/1250 on the adequacy of the protection provided by the EU-US Privacy Shield is invalid.

Background

The General Data Protection Regulation ((EU) 2016/679) (GDPR) prohibits the transfer of personal data outside of the EU to a third country unless certain conditions are met. In principle, a transfer may take place in any of the following circumstances:

  • On the basis of a European Commission adequacy decision (Article 45, GDPR).
  • Where there are appropriate safeguards in place, such as standard contractual clauses (SCCs) or Binding Corporate Rules (BCRs), and on the condition that data subjects have enforceable rights and effective legal remedies (Articles 46 and 47, GDPR).
  • A derogation for a specific situation applies, such as where the data subject has given their explicit consent (Article 49, GDPR).

EU-US Privacy Shield

The EU-US Privacy Shield is a framework constructed by the US Department of Commerce and the European Commission to enable transatlantic data protection exchanges for commercial purposes.

The EU-US Privacy Shield enables companies from the EU and the US to comply with data protection requirements when transferring personal data from the EU to the US. Approved by the European Commission on 12 July 2016, the EU-US Privacy Shield replaced the Safe Harbor Principles, which the ECJ had declared invalid on the basis that they did not ensure an adequate level of protection within the meaning of Article 25 of the Data Protection Directive, in the October 2015 decision of Maximillian Schrems v Data Protection Commissioner (Case C-362/14) [2015] EUECJ.

Schrems II Facts

In October 2015, Mr Maximillian Schrems, an Austrian lawyer and data privacy campaigner, successfully challenged the validity of the EU-US safe harbor arrangement as a legal basis for transferring personal data from Facebook Ireland to servers belonging to Facebook Inc located in the US (commonly referred to as the Schrems I judgment).

Subsequently, in July 2016, the European Commission adopted a replacement adequacy Decision 2016/1250 approving a new framework for EU-US personal data flows, the EU-US Privacy Shield.

Mr Schrems reformulated his complaint to the Irish Data Protection Commissioner, claiming that the US does not offer sufficient protection for personal data transferred to that country and sought the suspension or prohibition of future transfers of his personal data from the EU to the US, which Facebook Ireland now carries out in reliance on Decision 2010/87 on controller to processor SCCs.

One of Mr Schrems' key concerns was that the US government might access and use EU individuals' personal data contrary to rights guaranteed by the Charter of Fundamental Rights of the EU (Charter) and that EU individuals would have no remedy available to them once their personal data is transferred to the US. Under US law, internet service providers such as Facebook Inc can be required to provide information to various agencies such as the National Security Agency, the Central Intelligence Agency and the Federal Bureau of Investigation, and this information can be further used in various surveillance initiatives such as PRISM and UPSTREAM.

Decision on controller to processor SCCs

The use of SCCs remains valid, but businesses using controller to processor SCCs (or planning to do so) now face additional burdens: they will need to conduct a Transfer Impact Assessment to determine whether, in the overall context of the transfer, there are appropriate safeguards in the third country for the personal data transferred out of the EU (practically speaking, the European Economic Area). EU data exporters will need to take into account not only the destination of the personal data but also, in particular, any access by public authorities and the availability of judicial redress for individuals, to ascertain whether SCCs are an appropriate mechanism, and may need to put in place additional safeguards.

Decision on EU-US Privacy Shield

The ECJ held that the limitations on the protection of personal data, transferred from the EU to the US, arising from US domestic law "on the access to and use by US public authorities, are not circumscribed in a way that satisfies requirements that are essentially equivalent to those required under EU law, by the principle of proportionality, in so far as the surveillance programmes based on those provisions are not limited to what is strictly necessary".

As regards the requirement of judicial protection, the ECJ held that the Privacy Shield Ombudsperson does not provide individuals with any cause of action before a body which offers guarantees substantially equivalent to those required by EU law, so as to ensure the independence of the Ombudsperson and the existence of rules empowering the Ombudsperson to adopt decisions that are binding on US intelligence services.

EU-US Privacy Shield - Practical points:

  • The EU-U.S. Privacy Shield is no longer valid and businesses solely relying on it to transfer personal data to the U.S. should rely on another transfer solution, including by putting SCCs in place.
  • While SCCs remain valid, the underlying transfer must be assessed on a case-by-case basis to determine whether the personal data will be adequately protected (e.g. because of potential access by law enforcement or national security agencies). This is, in effect, a Transfer Impact Assessment. This will be burdensome for small organisations but also large ones making hundreds, if not thousands, of transfers.
  • The EU Commission is now likely to issue updated SCCs. Those new clauses could bake in the Transfer Impact Assessment discussed above. While existing SCCs will hopefully be “grandfathered”, businesses should anticipate changes to their processes for new transfers.
  • The judgment could have a negative impact on any adequacy finding for the UK after the Brexit transition period. While there are material differences between the U.S. and UK surveillance regimes, the judgment will no doubt make the EU Commission more cautious in future adequacy assessments.
  • In the absence of an adequacy finding, transfers of personal data from the EU to the UK will be more difficult post-Brexit as EU businesses will necessarily have to consider the effect of UK government surveillance powers, in particular the Investigatory Powers Act 2016.
  • While the data protection authorities cannot grant a “grace period” as such, they may well take a gradual approach to enforcing these new requirements. As an illustration, when the Safe Harbor was struck down in 2015, data protection authorities indicated they would not take active enforcement for a few months to allow controllers to make new arrangements.

More to come…

With the publishing of updated Standard Contractual Clauses expected and the UK adequacy decision pending, businesses handling cross-border data transfers to and from the EU or the US need to keep themselves informed of the latest developments. As it stands, SCCs will need to be part of such cross-border transfers and a ‘Transfer Impact Assessment’ will be a new and additional obligation.

If you have any questions on data protection law or on any of the issues raised in this article please get in touch with one of our data protection lawyers.


GDPR Report

GDPR Report: EU Commission’s First Evaluation of the GDPR

On 24 June 2020, just over two years after its entry into application, the European Commission published an evaluation report on the General Data Protection Regulation (the Regulation / GDPR). The GDPR report shows the Regulation has met most of its objectives, in particular by offering citizens a strong set of enforceable rights and by creating a new European system of governance and enforcement.

Scope of the GDPR report

The GDPR has proved flexible enough to support digital solutions in unforeseen circumstances such as the Covid-19 crisis. The GDPR report also concludes that harmonisation across the Member States is increasing, although there is a certain level of fragmentation that must be continually monitored. It also finds that businesses are developing a compliance culture and increasingly use strong data protection as a competitive advantage. The GDPR report contains a list of actions to further facilitate the application of the Regulation for all stakeholders, especially for small and medium-sized companies, and to promote and further develop a truly European data protection culture and vigorous enforcement.

Background to the GDPR report

The General Data Protection Regulation is a single set of rules of EU law on the protection of individuals with regard to the processing of personal data and on the free movement of such data. It strengthens data protection safeguards, provides additional and stronger rights to individuals, increases transparency, and makes all those that handle personal data more accountable and responsible. It has equipped national data protection authorities with stronger and harmonised enforcement powers and has established a new governance system among the data protection authorities. It also creates a level playing field for all companies operating in the EU market, regardless of where they are established, ensures the free flow of data within the EU, facilitates safe international data transfers and has become a reference point at global level.

As stipulated in Article 97(2) of the GDPR, the report covers in particular international transfers and ‘cooperation and consistency mechanism', although the Commission has taken a broader approach in its review, in order to address issues raised by various actors during the last two years. These include contributions from the Council, the European Parliament, the EDPB, national data protection authorities and stakeholders. Key findings of the GDPR review are:

Empowering individuals to control their data

The GDPR enhances transparency and gives individuals enforceable rights, such as the right of access, rectification, erasure, the right to object and the right to data portability. Today, 69% of the population above the age of 16 in the EU have heard about the GDPR and 71% of people have heard about their national data protection authority, according to results published last week in a survey from the EU Fundamental Rights Agency. However, more can be done to help citizens exercise their rights, notably the right to data portability.

The application of the GDPR to new technologies

The GDPR report found that the Regulation has empowered individuals to play a more active role in relation to what is happening with their data in the digital transition. It is also contributing to fostering trustworthy innovation, notably through a risk-based approach and principles such as data protection by design and by default.

Enforcement of the GDPR

From warnings and reprimands to administrative fines, the GDPR provides national data protection authorities with the right tools to enforce the rules. However, they need to be adequately supported with the necessary human, technical and financial resources. Many Member States are doing this, with notable increases in budgetary and staff allocations. The GDPR report found that overall, there has been a 42% increase in staff and 49% in budget for all national data protection authorities taken together in the EU between 2016 and 2019. However, there are still stark differences between Member States.

Harmonised rules but still a degree of fragmentation and diverging approaches

The GDPR established an innovative governance system which is designed to ensure a consistent and effective application of the GDPR through the so called ‘one stop shop', which provides that a company processing data cross-border has only one data protection authority as interlocutor, namely the authority of the Member State where its main establishment is located. Between 25 May 2018 and 31 December 2019, 141 draft decisions were submitted through the ‘one-stop-shop', 79 of which resulted in final decisions. However, the GDPR report concludes that more can be done to develop a truly common data protection culture. In particular, the handling of cross-border cases calls for a more efficient and harmonised approach and an effective use of all tools provided in the GDPR for the data protection authorities to cooperate.

Advice and guidelines by data protection authorities

The EDPB is issuing guidelines covering key aspects of the Regulation and emerging topics. Several data protection authorities have created new tools, including helplines for individuals and businesses, and toolkits for small and micro-enterprises. It is essential to ensure that guidance provided at national level is fully consistent with guidelines adopted by the EDPB.

Developing a modern international data transfer toolbox

The GDPR report found that over the past two years, the Commission's international engagement on free and safe data transfers has yielded important results. This includes Japan, with which the EU now shares the world's largest area of free and safe data flows. The Commission will continue its work on adequacy with its partners around the world. In addition, and in cooperation with the EDPB, the Commission is looking at modernising other mechanisms for data transfers, including Standard Contractual Clauses, the most widely used data transfer tool. The EDPB is working on specific guidance on the use of certification and codes of conduct for transferring data outside of the EU, which need to be finalised as soon as possible. Given that the European Court of Justice may provide clarifications in a judgment to be delivered on 16 July that could be relevant for certain elements of the adequacy standard, the Commission will report separately on the existing adequacy decisions after the Court of Justice has handed down its judgment.

Promoting convergence and international cooperation in the area of data protection

Over the last two years, the Commission has stepped up bilateral, regional and multilateral dialogue, fostering a global culture of respect for privacy and convergence between different privacy systems to the benefit of citizens and businesses alike. The Commission is committed to continuing this work as part of its broader external action, for example, in the context of the Africa-EU Partnership and in its support for international initiatives, such as ‘Data Free Flow with Trust'. At a time when violations of privacy rules may affect large numbers of individuals simultaneously in several parts of the world, it is time to step up international cooperation between data protection enforcers. This is why the Commission will seek authorisation from the Council to open negotiations for the conclusion of mutual assistance and enforcement cooperation agreements with relevant third countries.

Challenges for small and medium-sized enterprises (SMEs)

The GDPR report noted that the Regulation, together with the Free Flow of Non-Personal Data Regulation offers opportunities to companies by fostering competition and innovation, ensuring the free flow of data within the EU and creating a level playing field with companies established outside the EU. The right to portability, coupled with an increasing number of individuals in search of more privacy-friendly solutions, have the potential to lower the barriers to entry for businesses and open the possibilities for growth based on trust and innovation. However, some stakeholders report that the application of the GDPR is challenging especially for small and medium sized enterprises.

SMEs stress in particular the importance and usefulness of codes of conduct which are tailored to their situation and which do not entail disproportionate costs. As regards certification schemes, security (including cybersecurity) and data protection by design are key elements to be considered under the GDPR and would benefit from a common and ambitious approach throughout the EU. The Commission is currently working on standard contractual clauses between controllers and processors, building on the on-going work on the modernisation of the standard contractual clauses for international transfers.

At EM Law we specialise in helping small and medium sized companies comply with the GDPR. If you have any questions on data protection law or on any of the issues raised in this article please get in touch with one of our data protection lawyers.


Digital Marketing

Digital Marketing - Legal Issues

Digital Marketing is a growth industry with legislation struggling to keep up. Unsurprisingly, though, there are legal issues that digital marketing businesses need to be aware of to remain compliant. The House of Lords' 2018 report "UK advertising in a digital age" noted that digital marketing accounted for over half of all spending on advertising in the UK for the first time in 2017. This figure is likely only to increase, especially in the aftermath of COVID-19. This article provides some background into the types of digital marketing and some of the legal issues to consider in this context.

Digital marketing formats

The Digital Adspend study produced by industry body the Internet Advertising Bureau (IAB) and accountants PricewaterhouseCoopers, breaks down 2017 digital marketing spend by format, as follows:

Paid-for search: £5.82bn, of which smartphone spend was £2.62bn. This is essentially sponsored search results, where advertisers pay to have their details presented at the top of a search results page or prominently featured elsewhere on the page.

Display: £4.18bn, within which falls:

  • Online video: £1.61bn, of which smartphone spend was £1.17bn. An example is the pre-roll advert which appears before you watch a YouTube clip, or videos which start playing as the page loads or when your mouse scrolls over them.
  • Banners and standard display formats: £1.31bn, of which smartphone spend was £418m. These are the obvious adverts and include those which appear across the top of the screen (banner adverts) or in a sidebar, overlay adverts (which pop up on-screen and have to be clicked to close) and interstitial adverts (full-screen adverts that pop up between expected content, for example before a target page appears on the screen).

Native: £1.03bn, of which smartphone spend was £895m. An advertorial is native advertising, as are adverts which appear to be recommendations by the publisher ("you might also like"), influencer marketing on social media and adverts which appear to be search results.

Classified and other: £1.47bn. Classified advertising is advertising in an online directory or marketplace (for example, Rightmove, Auto Trader and Gumtree).

Commentators note that the biggest increase recently has been in spend on advertising targeting mobile phone users, in particular using a video format.

Key industry players

The CMA's Final Report on its Digital Marketing Market Study estimates that search advertising revenues totalled around £7.3 billion in 2019, of which more than 90% was earned by Google. Total spend on display advertising was worth £5.5 billion, of which it is estimated more than half went to Facebook.

Google receives revenue from its search engine and other brands such as YouTube, Google Maps and Google Play (an app and digital media store). Google sells advertising space on its own and other sites through Google Ads, and provides services to buy and optimise campaigns on Google via its Google Marketing Platform.

Digital Marketing Legal Issues

Adverts must be obviously identifiable as such.

All advertising must be obviously identifiable as advertising. This is a requirement under:

The Consumer Protection from Unfair Trading Regulations 2008 (SI 2008/1277) (CPUT) which implement the Unfair Commercial Practices Directive (2005/29/EC) (UCPD):

  • A failure to identify commercial intent, unless this is already apparent from the context, is a misleading omission.
  • Using editorial content in the media to promote a product where a trader has paid for the promotion without making that clear in the content or by images or sounds clearly identifiable by the consumer (advertorial) is a prohibited commercial practice.
  • Falsely claiming or creating the impression that the trader is not acting for purposes relating to his trade, business, craft or profession, or falsely representing oneself as a consumer is a prohibited commercial practice.

The Electronic Commerce (EC Directive) Regulations 2002 (SI 2002/2013) (E-Commerce Regulations) which implement the E-Commerce Directive (2000/31/EC):

  • Service providers must ensure that any commercial communication provided by them which constitutes or forms part of an information society service (which would include all advertising) is clearly identifiable as a commercial communication.

The UK Code of Non-broadcast Advertising and Direct & Promotional Marketing (CAP Code):

  • Marketing communications must be obviously identifiable as such.
  • Marketing communications must not falsely claim or imply that the marketer is acting as a consumer, or for purposes outside its trade, business, craft or profession; marketing communications must make clear their commercial intent, if that is not obvious from the context.
  • Marketers and publishers must make clear that advertorials are marketing communications; for example, by heading them "advertisement feature".

Information obligations on digital advertisers

Online advertisers need to:

  • Provide certain information about themselves on their websites.
  • Include certain information about themselves and their products in their online adverts.

These obligations, which apply to "information society service" providers, derive from the E-Commerce Regulations which implement the E-Commerce Directive (2000/31/EC) (E-Commerce Directive).

Information advertisers must include on websites

The information the advertiser must include on websites consists of:

  • Its name.
  • The geographic address at which it is established.
  • Details, including an email address, which make it possible to contact the advertiser rapidly and communicate with it in a direct and effective manner.
  • Where the advertiser is registered in a trade (or similar) register available to the public, details of the register in which the service provider is entered and its registration number, or equivalent means of identification in that register.
  • Where the provision of the service is subject to an authorisation scheme, the particulars of the relevant supervisory authority. Advertising itself is not subject to an authorisation scheme in the UK, but the advertiser's business may be.
  • The advertiser's VAT number.
  • Where the advertiser exercises a regulated profession:
  • the details of any professional body or similar institution with which the advertiser is registered;
  • the advertiser's professional title and the EEA state where that title has been granted; and
  • a reference to the professional rules applicable to the service provider in the member state of establishment, and the means to access them.

Information requirements for online adverts

An information society service provider (which includes any online advertiser) must ensure that any commercial communication provided by it as part of an information society service (which would include all digital marketing) shall:

  • Be clearly identifiable as a commercial communication.
  • Clearly identify the person on whose behalf the commercial communication is made.
  • Clearly identify as such any promotional offer (including any discount, premium or gift) and ensure that any conditions which must be met to qualify for it are easily accessible and presented clearly and unambiguously.
  • Clearly identify as such any promotional competition or game and ensure that any conditions for participation are easily accessible and presented clearly and unambiguously.

Digital Marketing: Controls on the use of personal data and online behavioural advertising (OBA)

The digital environment offers advertisers the opportunity to track users' online behaviour to build a profile of their interests and target advertising at them. This practice is known as "online behavioural advertising" (OBA) or sometimes as interest-based advertising (IBA).

Information is generally collected using online identifiers (such as cookies, internet protocol (IP) addresses, radio frequency identification (RFID) tags, advertising IDs, pixel tags, account handles and device fingerprints) which can be used variously to note information such as searches conducted, content viewed, purchases made and the user's location. Data about browsing habits can be combined with information about the user obtained via registrations and purchases.

OBA may be conducted by a website owner solely based on activity on its own site (first-party OBA) or by a third party tracking activity across multiple websites and user devices and serving adverts for products not necessarily sold on the website being viewed (third-party OBA).

Examples of OBA include:

  • Advertising (such as pop-ups and banners) for products a user is likely to be interested in based on their interests, as revealed by their browsing habits or searches.
  • Retargeting of adverts for products a user has viewed, encouraging them to go back and make or complete a purchase.
  • Advertising to a mobile phone promoting a cafe which a user is passing near to.

Advertisers need to be aware that if they have collected personal data at any stage in the process enabling them to target advertising at individuals, they will be classified as a data controller, unless they are acting on behalf of another data controller, in which case they may be a data processor. A data controller must notify the individuals whose personal data they are using about who they are, what personal data they are collecting and what they are using that data for. They must also only process that data under one of the specified lawful bases. So if, for example, an advertiser is processing personal data relating to an individual’s political or religious beliefs, the advertiser will need to obtain consent to such processing from the individual.

Cookies

The Privacy and Electronic Communications (EC Directive) Regulations 2003 (SI 2003/2426) require the user's consent to the use of non-essential cookies and similar technologies on their devices, including computers or mobiles, but also other equipment such as wearable technology, smart TVs, and connected devices including the ‘Internet of Things’.

If the advertiser’s cookies are collecting personal data then the advertiser will also need to comply with data protection laws as a data controller.
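As a purely hypothetical illustration (the cookie and purpose names below are invented, not taken from any guidance), the PECR position can be reduced to a simple rule: strictly necessary cookies may always be set, while non-essential ones require the user's prior opt-in for the relevant purpose.

```python
# Hypothetical sketch of a PECR-style cookie decision: essential cookies
# need no consent; analytics and advertising cookies require opt-in.
# All names here are invented for illustration only.

ESSENTIAL = {"session_id"}  # strictly necessary for the service itself

def cookies_to_set(consented_purposes):
    """Return the cookies the site may set for this user.

    `consented_purposes` is the set of non-essential purposes the user
    has actively opted into via a consent banner.
    """
    allowed = set(ESSENTIAL)  # always permitted: strictly necessary
    if "analytics" in consented_purposes:
        allowed.add("analytics_id")
    if "advertising" in consented_purposes:
        allowed.add("ad_tracking_id")
    return allowed

print(cookies_to_set(set()))          # no opt-ins: only the essential cookie
print(cookies_to_set({"analytics"}))  # essential cookie plus analytics
```

The design point is that consent is checked per purpose before any non-essential cookie is set, rather than set first and asked about later.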

A short introduction

Digital marketing can give rise to many legal issues and what has been mentioned here is only a short overview. The content of adverts and websites and the use of personal data need to be considered from the outset.

EM law are experts in media, technology and data protection law. Please contact us if you need any help with digital marketing legal issues.


Big Data

Big Data – AI and Machine Learning

The use of computers and the internet has allowed unprecedented amounts of data to be collected and used for a variety of ends. Big data technology represents the most advanced and sizeable use of this new asset. The size and extent of such operations come up against a number of regulatory barriers, most notably the General Data Protection Regulation (EU) 2016/679 (GDPR).

What is Big Data?

Big data is the harnessing, processing and analysis of digital data in huge and ever-increasing volume, variety and velocity. It has quickly risen up the corporate agenda as organisations appreciate that they can gain advantage through valuable insights about their customers and users through the techniques that are rapidly developing in the data world.

Much big data (for example, climate and weather data) is not personal data. Personal data relates to an identifiable living individual. For data that is or could be personal data, data protection legislation, in particular the GDPR, must be carefully considered.

Brexit

During the transition period (which ends on 31 December 2020 unless extended) and afterwards, organisations should, as the ICO has noted, continue data protection compliance as usual. The key principles, rights and obligations will remain the same and organisations already complying with the GDPR should be in a good position to comply with the post-Brexit data protection regime.

Big Data Analytics, Artificial Intelligence and Machine Learning

Being able to use big data is critical to the development of Artificial Intelligence (AI) and machine learning. AI is the ability of a computer to perform tasks commonly associated with human beings. In particular, AI can cope with, and to a large extent is predicated on, the analysis of huge amounts of data in its varying shapes, sizes and forms.

Machine learning is a set of techniques that allows computers to ‘learn’ by building mathematical models from accumulated data.
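As an illustration only (not drawn from the ICO guidance, and using invented data), the ‘learning’ step can be as simple as fitting parameters to past observations. The sketch below fits a one-variable linear model by ordinary least squares using nothing but the standard library:

```python
# Minimal sketch of "learning from accumulated data": fit y = a*x + b
# by ordinary least squares over past observations. Data is invented.

def fit_linear(xs, ys):
    """Return (slope, intercept) minimising squared error over the data."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var = sum((x - mean_x) ** 2 for x in xs)
    slope = cov / var
    intercept = mean_y - slope * mean_x
    return slope, intercept

# "Accumulated data": the observations the model generalises from.
data_x = [1, 2, 3, 4, 5]
data_y = [2.1, 4.0, 6.2, 7.9, 10.1]
a, b = fit_linear(data_x, data_y)
print(a, b)  # the learned parameters, usable to predict unseen inputs
```

Real machine learning systems fit far more complex models over far more data, but the principle is the same: the algorithm's behaviour is derived from the data rather than hand-coded.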

Big data, AI and machine learning are linked as described by the ICO:

“In summary, big data can be thought of as an asset that is difficult to exploit. AI can be seen as a key to unlocking the value of big data; and machine learning is one of the technical mechanisms that underpins and facilitates AI. The combination of all three concepts can be called ‘big data analytics’.” (Paragraph 11 of ICO: Big data and data protection 2017.)

Big data analytics differs from traditional data processing in the following ways:

  • It uses complex algorithms for processing data. This usually involves a “discovery” phase to find relevant correlations (which can be a form of machine learning) so that algorithms can be created.
  • There is limited transparency on how these algorithms work and how data is processed. As vast amounts of data are processed through massive networks, a “black box” effect is created that makes it very difficult to understand the reasons for decisions made by the algorithms.
  • There is a tendency to collect “all the data” as it is more easily available rather than limiting the analytics to random samples or statistically representative samples.
  • Data is often re-used for a purpose different from that for which it was originally collected, frequently because it is obtained from third parties.
  • It usually involves data from new sources such as the Internet of Things (IoT) and “observed” data that has been generated automatically, for example by tracking online behaviour rather than data provided by individuals. In addition, new “derived” or “inferred” data produced by the algorithms is used further in the analytics.

Big Data and Data protection

Managing compliance with the GDPR will play a large part in big data management projects involving data harvested from the expanding range of available digital sources. Many organisations will already have an established data protection governance structure and policy and compliance framework in place and these can be helpful as pathfinders towards structured data governance.

Controller or processor?

Under Article 4(7) of the GDPR, a person who determines “the purposes and means” of processing personal data is a controller and under Article 4(8), a processor just processes personal data on behalf of the controller.

Correctly assessing whether an organisation is a controller or a processor in the context of the collection of massive amounts of data is therefore critical to the GDPR compliant structuring of the relationship and to allocating risk and responsibility.

However, the borderline between controller and processor can be fuzzy in practice. Where it lies in the AI context was considered for the first time in the UK in the ICO’s July 2017 decision on an agreement between the Royal Free Hospital and Google DeepMind. Under the agreement, DeepMind used the UK’s standard, publicly available acute kidney injury (AKI) algorithm to process personal data of 1.6m patients in order to test the clinical safety of Streams, an AKI application that the hospital was developing. The ICO ruled that the hospital had failed to comply with data protection law and, as part of the remediation required by the ICO, the hospital commissioned law firm Linklaters to audit the system. The hospital published the audit report in May 2018, which found (at paragraph 20.7) that the agreement had properly characterised DeepMind as a processor not a controller.

Important to this characterisation were the facts that the algorithm was simple and that its use had been mandated by the NHS. Understanding whether an organisation is a processor or controller is a complex issue and seeking advice on the matter may be crucial to understanding potential liabilities for those using big data.

Personal data

In the context of big data, it is worth considering whether personal data can be fully anonymised, which would take it outside data protection requirements altogether. This is noted in Recital 26 of the GDPR, which says that:

"the principles of data protection should therefore not apply to anonymous information, namely information which does not relate to an identified or identifiable natural person or to personal data rendered anonymous in such a manner that the data subject is not or no longer identifiable".

However, personal data which has been pseudonymised (in other words, data that could still identify an individual when combined with additional information) is still classed as personal data.
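The distinction can be made concrete with a small, purely illustrative sketch (all names and keys below are invented). Replacing an email address with a keyed hash is pseudonymisation, not anonymisation: anyone holding the key can rebuild the mapping and re-identify the individual, which is why the data remains personal data.

```python
# Illustrative sketch of pseudonymisation: an identifier is replaced by a
# keyed hash. The "additional information" (the key) still permits
# re-identification, so the output remains personal data under the GDPR.
import hashlib
import hmac

SECRET_KEY = b"keep-me-separate-from-the-dataset"  # the additional information

def pseudonymise(email: str) -> str:
    return hmac.new(SECRET_KEY, email.encode(), hashlib.sha256).hexdigest()[:16]

record = {"email": "alice@example.com", "spend": 120}
pseudonymised = {"user": pseudonymise(record["email"]), "spend": record["spend"]}

# Re-identification: whoever holds the key can rebuild the mapping.
lookup = {pseudonymise(e): e for e in ["alice@example.com", "bob@example.com"]}
print(lookup[pseudonymised["user"]])  # recovers alice@example.com
```

True anonymisation would require that no such key or mapping exists anywhere, so that the data subject is no longer identifiable by any means reasonably likely to be used.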

Profiling

The GDPR includes a definition of profiling that is relevant to the processing of big data. Profiling is defined as any form of automated processing of personal data used to evaluate certain personal aspects relating to a natural person, in particular to analyse or predict the following: performance at work; economic situation; health; personal preferences; interests; reliability; behaviour; location; movements. (Article 4(4), GDPR.)

The GDPR includes data subject rights in relation to automated decision making, including profiling. The fact that profiling is taking place must be disclosed to the individual, together with information about the logic involved, as well as the significance and the envisaged consequences for such processing.

Individuals have the right not to be subject to a decision based solely on automated processing (which includes profiling), which produces legal effects concerning them or similarly significantly affects them (Article 22(1), GDPR). However, this right will not apply in certain cases, for example if the individual has given explicit consent, although suitable measures must be implemented to protect the data subjects.
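One commonly discussed safeguard is a human-in-the-loop: adverse solely automated outcomes are escalated to a reviewer rather than applied directly. The sketch below is a hypothetical illustration of that pattern (the scoring, threshold and function names are invented, not a statement of what Article 22 requires):

```python
# Hypothetical sketch of a human-review safeguard for automated
# decision-making: favourable outcomes are automated, adverse ones are
# referred to a human before any final decision. Names are invented.

def automated_decision(score: float, threshold: float = 0.5):
    """Score-based decision with a human-review safeguard.

    Returns a (decision, needs_human_review) pair: an adverse automated
    outcome is never final without human involvement.
    """
    if score >= threshold:
        return ("approve", False)  # favourable: automated approval
    return ("refer", True)         # adverse: escalate to a human reviewer

print(automated_decision(0.8))  # favourable, no review needed
print(automated_decision(0.2))  # adverse, referred for human review
```

The design choice here is that automation handles the easy, favourable cases, while decisions with significant effects on the individual always involve a person.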

Fair processing

In the ICO Big Data Paper 2017, the ICO emphasises the importance of fairness, transparency and meeting the data subject’s reasonable expectations in data processing. It states that transparency about how the data is used will be an important element when assessing compliance. It also highlights the need to consider the effect of the processing on the individuals concerned as well as communities and societal groups concerned. Similarly, the EDPS 2015 opinion stresses that organisations must be more transparent about how they process data, afford users a higher degree of control over how their data is used, design user friendly data protection into their products and services and become more accountable for what they do.

Transparency

As well as the general requirement for lawfulness, fairness and transparency in Article 5(1)(a), the GDPR includes specific obligations on controllers to provide data subjects with certain prescribed information (typically done in the form of a privacy notice) (Articles 13 and 14, GDPR).

The ICO Big Data Paper 2017 notes that the complexity and opacity of data analytics can lead to mistrust and potentially be a barrier to data sharing, particularly in the public sector. In the private sector, it can lead to reduced competitiveness from lack of consumer trust. Therefore privacy notices are a key tool in providing transparency in the data context. In relation to privacy notices, the Paper suggests using innovative approaches such as videos, cartoons, icons and just-in-time notifications, as well as a combination of approaches to make complex information easier to understand.

An introduction

This blog is no more than an introduction to, and summary of, some of the legal issues raised by big data. In many ways the GDPR was created in response to such activity, so the extent of its applicability to the topic is unsurprising. Any organisation looking to undertake such a project should be aware of the regulations in a way that allows compliance to be built into its operations from the outset.

If you have any questions on data protection law or on any of the issues raised in this article please get in touch with one of our data protection lawyers.


Wm Morrison Supermarkets plc

Data Breach Claims – Wm Morrison Supermarkets plc

In Wm Morrison Supermarkets plc v Various Claimants [2020] UKSC 12, the Supreme Court has overturned judgments of the High Court and Court of Appeal and decided that a supermarket was not vicariously liable for unauthorised breaches of the Data Protection Act 1998 committed by an employee.

Wm Morrison Supermarkets plc v Various Claimants - the facts

In 2013, Mr Skelton, who was then employed by Wm Morrison Supermarkets plc (Morrisons) as an internal IT auditor, was provided with a verbal warning for minor misconduct. Subsequently, he developed an irrational grudge against his employer. After being asked by Morrisons to provide payroll data for the entire workforce to external auditors, Mr Skelton copied the data onto a USB stick. He took the USB stick home and posted the data on the internet, using another employee's details in an attempt to conceal his actions. He also sent this data to three national newspapers, purporting to be a concerned member of the public.

The newspapers did not publish the data, but one newspaper alerted Morrisons, who immediately took steps to remove the data from the internet, contact the police and begin an internal investigation. Morrisons spent £2.26 million dealing with the aftermath of the disclosure, a large proportion of which was spent on security measures for its employees. Mr Skelton was arrested and ultimately convicted of criminal offences under the Computer Misuse Act 1990 and section 55 of the DPA 1998, which was in force at the time.

The claimants in this case were 9,263 of Morrisons' employees or former employees. They claimed damages from Morrisons in the High Court for misuse of private information and breach of confidence, and for breach of its statutory duty under section 4(4) of the DPA 1998. The claimants alleged that Morrisons was either primarily liable under those heads of claim or vicariously liable for Mr Skelton's wrongful conduct.

Data Protection Act 1998

This case was decided under the Data Protection Act 1998 (DPA 1998) which was applicable at the time. The DPA 1998 implemented the Data Protection Directive (95/46/EC) and imposed broad obligations on those who collect personal data (data controllers), as well as conferring broad rights on individuals about whom data is collected (data subjects). Section 4(4) of the DPA 1998 provided that a data controller must comply with eight data protection principles in relation to all personal data with respect to which they are a controller.

Under section 13(1), any breach of the DPA 1998 which caused damage entitled the victim to compensation for that damage. Section 13(2) provided as follows:

"An individual who suffers distress by reason of any contravention by a data controller of any of the requirements of this Act is entitled to compensation from the data controller for that distress if the individual also suffers damage by reason of the contravention."

Under section 13(3), it was a defence to any proceedings under section 13 for a person, or in this case Morrisons, to prove that they had taken such care as was reasonably required in all the circumstances to comply with the relevant requirement.

Vicarious liability

It was also crucial to consider whether Morrisons could be vicariously liable for their employee’s action in this instance. Employers will be liable for torts committed by an employee under the doctrine of vicarious liability where there is a sufficient connection between the employment and the wrongdoing. There is a two-stage test:

  • Is there a relationship between the primary wrongdoer and the person alleged to be liable which is capable of giving rise to vicarious liability?
  • Is the connection between the employment and the wrongful act or omission so close that it would be just and reasonable to impose liability?

In Lister v Hesley Hall Ltd [2001] UKHL 22, the House of Lords characterised the second stage as a "sufficient connection" test. The question was whether the torts were "so closely connected with [the] employment that it would be fair and just to hold the employers vicariously liable".

In Mohamud v Wm Morrison Supermarkets plc [2016] UKSC 11 (Mohamud), the Supreme Court held that the supermarket was vicariously liable for an employee's unprovoked violent assault on a customer. It found that there was a sufficiently close connection between the assault and the employee's job of attending to customers, such that the employer should be held vicariously liable.

Wm Morrison Supermarkets plc - Decision

The Supreme Court held that Morrisons was not vicariously liable for Mr Skelton's actions. It found that the Court of Appeal had misunderstood the principles governing vicarious liability in the following respects:

  • The disclosure of the data on the internet did not form part of Mr Skelton's functions or field of activities. This was not an act which he was authorised to do.
  • Although there was a close temporal link and an unbroken chain of causation linking the provision of the data to Mr Skelton for the purpose of transmitting it to the auditors and his disclosing it on the internet, a temporal or causal connection did not in itself satisfy the close connection test.
  • The reason why Mr Skelton acted wrongfully was not irrelevant. Whether he was acting on his employer's business or for purely personal reasons was highly material.

The mere fact that Mr Skelton's employment gave him the opportunity to commit the wrongful act was not sufficient to warrant the imposition of vicarious liability. It was clear that Mr Skelton was not engaged in furthering his employer's business when he committed the wrongdoing. On the contrary, he was pursuing a personal vendetta. His wrongful conduct was not so closely connected with acts which he was authorised to do that it could fairly and properly be regarded as done by him while acting in the ordinary course of his employment.

Comment

This decision will provide welcome confirmation for employers that they will not always be liable for data breaches committed by rogue employees. It similarly provides helpful clarification for practitioners on the way in which the judgment in Mohamud should be applied in future cases concerning vicarious liability.

The facts in this case were extreme. It seems that Morrisons were wholly unaware of the grudge held by Mr Skelton. Mr Skelton also took extraordinary actions to cover up what he had done and even to frame another employee.

Unanswered questions

Had Morrisons been found vicariously liable for Mr Skelton’s actions, the employees who made the claims would have had to prove that they suffered ‘distress, anxiety, upset and damage’ as a result of the mishandling of their personal information. A Supreme Court ruling on the issue would have provided a helpful benchmark for those wanting to understand more about how our courts quantify compensation for data breaches.

Moving forward

Employers should take away from the judgment that, although this case was decided under the previous data protection regime, the DPA 1998 and the GDPR are based on broadly similar principles. The GDPR and Data Protection Act 2018 (DPA 2018) will therefore not be a barrier to vicarious liability claims in data privacy proceedings commenced under the current regime.

Additionally, the GDPR makes compliance far more onerous for controllers and exposes them to substantial revenue-based fines and data subject compensation claims for breaches of the GDPR and the DPA 2018. This includes failing to safeguard data to statutory standards and neglecting to have governance in place to curb the malicious acts of rogue employees.

Morrisons' success in bringing to an end the threat of a group action for compensation in this case follows Google LLC being granted permission to appeal against the Court of Appeal's order in Lloyd v Google LLC [2019] EWCA Civ 1599, and is another significant development in the progress of representative class actions in the UK legal system.

If you have any questions on data protection law or on any of the issues raised in this article please get in touch with one of our data protection lawyers.


COVID-19 Contact Tracing Apps - Privacy Concerns

Contact Tracing Apps – Privacy Concerns

Contact tracing apps are being developed by governments and private enterprises to fight COVID-19. Their design and use however raise serious privacy concerns.

How do contact tracing apps work?

Contact tracing apps are mobile software applications designed to help identify individuals who may have been in contact with an infected person.

In the context of COVID-19 this means that anyone with the app who has been diagnosed with the virus, or has self-diagnosed, can enter that information into the app. Then, via Bluetooth, anyone who has come, or comes, into contact with that diagnosed or self-diagnosed person will be notified by the app. If you are notified of such contact, you can take steps to self-quarantine or otherwise manage your exposure. This all relies upon individuals carrying their mobile phones at all times with Bluetooth activated, which has cast doubt on the apps' potential effectiveness.
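The flow described above can be sketched in outline. This is a simplified illustration only, not the implementation of any real contact tracing app; the class and method names are hypothetical.

```python
from dataclasses import dataclass, field


@dataclass
class ContactTracer:
    """Simplified sketch of the contact tracing flow: log proximity,
    record diagnoses, and work out who should be notified."""

    # phone_id -> set of phone_ids it has been near (logged via Bluetooth)
    contacts: dict = field(default_factory=dict)
    diagnosed: set = field(default_factory=set)

    def record_proximity(self, phone_a: str, phone_b: str) -> None:
        """Log that two phones came within Bluetooth range of each other."""
        self.contacts.setdefault(phone_a, set()).add(phone_b)
        self.contacts.setdefault(phone_b, set()).add(phone_a)

    def report_diagnosis(self, phone_id: str) -> None:
        """A user enters a diagnosis (or self-diagnosis) into the app."""
        self.diagnosed.add(phone_id)

    def users_to_notify(self) -> set:
        """Everyone who has been near a diagnosed user is alerted."""
        exposed = set()
        for sick in self.diagnosed:
            exposed |= self.contacts.get(sick, set())
        return exposed - self.diagnosed
```

For example, if phone A and phone B were logged in proximity and A's user later reports a diagnosis, `users_to_notify()` returns B so that user can self-quarantine.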

Why adopt contact tracing apps?

By tracing the contacts of infected individuals, testing them for infection, treating the infected and tracing their contacts in turn, public health authorities aim to reduce infections in the population. Diseases for which contact tracing is commonly performed include tuberculosis, vaccine-preventable infections like measles, sexually transmitted infections (including HIV), blood-borne infections, some serious bacterial infections, and novel infections (e.g. coronavirus).

Privacy issues with contact tracing apps

Numerous applications are in development, with official government support in some territories and jurisdictions. Several frameworks for building contact tracing apps have been developed. Privacy concerns have been raised, especially about systems that are based on tracking the geographical location of app users.

Less intrusive alternatives include the use of Bluetooth signals to log a user's proximity to other mobile phones. On 10 April 2020, Google and Apple jointly announced that they would integrate functionality to support such Bluetooth-based apps directly into their Android and iOS operating systems.

These Bluetooth signals offer greater privacy protection because they operate on an anonymous basis. Someone who comes into contact with an infected person therefore learns nothing beyond the fact of that contact, rather than receiving unnecessary information such as a unique identifying code or the name of the infected person.
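The anonymity property can be illustrated with randomly generated, regularly rotated identifiers, loosely in the spirit of the Google/Apple approach. The details below are illustrative assumptions for this blog, not the actual protocol.

```python
import secrets


def new_rolling_id() -> str:
    """Generate a fresh random identifier with no link to the user's
    real identity. Privacy-preserving designs rotate these frequently,
    so identifiers cannot be used to track a person over time."""
    return secrets.token_hex(16)


def exposure_check(observed_ids: set, published_infected_ids: set) -> bool:
    """A phone compares identifiers it overheard over Bluetooth against
    the anonymous identifiers published for diagnosed users. A match
    reveals only 'you were near an infected person', never who they were."""
    return bool(observed_ids & published_infected_ids)
```

Because each identifier is random and short-lived, a match tells the user nothing about the infected person's identity, which is the privacy advantage described above.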

ICO’s blog

The Information Commissioner (IC), Elizabeth Denham, has published a blog setting out data protection considerations for organisations using contact tracing and location data technologies in connection with the COVID-19 pandemic.

While the IC is maintaining a pragmatic and flexible approach to data protection compliance during the pandemic, the IC reminds organisations that the public must remain assured that their data will be processed lawfully when technology is used to track the spread of COVID-19.

To help achieve the IC's twin goals of maintaining public trust and promoting compliance, the blog includes a series of questions for organisations to bear in mind when using new technologies to combat the pandemic. It focusses on compliance with data protection requirements under Article 25 of the General Data Protection Regulation ((EU) 2016/679) (GDPR), the data minimisation and storage limitation principles under Article 5(1), and data subject rights generally under the GDPR.

The IC asks organisations to consider the following questions:

  • Have you demonstrated how privacy is built into the processing technology?
  • Is the planned collection and use of personal data necessary and proportionate?
  • What control do users have over their data?
  • How much data needs to be gathered and processed centrally?
  • When in operation, what are the governance and accountability processes in your organisation for ongoing monitoring and evaluation of the data processing, that is, to ensure it remains necessary and effective, and that the safeguards in place are still suitable?
  • What happens when the processing is no longer necessary?

The IC extends an offer to assist organisations with these processes, by providing guidance and tools to consider data protection requirements in the planning and development phase for projects adopting new technology, and by performing an audit of the measures and processes implemented by an organisation when the project has become operational.

In practice

The Information Commissioner's Office (ICO) has published a discussion document setting out its expectations and recommended best practice for the development and deployment of COVID-19 contact tracing apps.

The document was published in advance of Information Commissioner Elizabeth Denham's and Executive Director of Technology and Innovation Simon McDougall's appearance before the Human Rights Joint Committee on 4 May 2020 and is intended to help NHSX and other developers of contact tracing apps comply with the information provision and data protection by design and by default requirements under the GDPR.

Key principles and recommendations for developers to consider include:

  • Performing a Data Protection Impact Assessment (DPIA) prior to implementation of the app and refreshing the DPIA whenever the app is updated during its life cycle.
  • Being transparent with users and providing them with clear information about the purpose and design choices for the app and the benefits the app seeks to deliver for both users and the NHS. Users must also be fully informed about the data to be processed by the app before the processing takes place.
  • Complying with data minimisation, retention and security principles under Articles 5(1) and 32 of the GDPR.
  • Ensuring participation is voluntary and users can opt in and out of participation and exercise their data subject rights (including rights of access, erasure, restriction and rectification) with ease. This could involve the developer providing users with a dedicated privacy control panel or dashboard.
  • Relying on valid user consent or an alternative lawful basis under Article 6(1) of the GDPR for the processing of personal data where this is necessary and more appropriate, such as performance of a task in the public interest (particularly where an app is developed by or on behalf of a public health authority).
  • The collection of personal data relating to health shall be allowed only where the processing is either based on explicit consent, is necessary for reasons of public interest in the area of public health, is for health care purposes, or is necessary for scientific research or statistical purposes.

The ICO will keep these recommendations under review and remains open to feedback.

What does this mean for businesses?

If contact tracing apps are designed in line with ICO guidance, businesses looking to monitor employees can have confidence in asking employees to use such apps. In all likelihood the NHSX app will be used in the UK and therefore businesses should be aware of how that app is being developed.

NHSX development

On 12 April 2020, Matt Hancock, the Secretary of State for Health and Social Care and the politician directly responsible for the NHS, announced that the NHS was developing a mobile app that will allow for contact tracing. The app is being developed by NHSX, a specialist unit responsible for digital transformation in the NHS.

In response to the Information Commissioner’s approach, NHSX has stated that they are prioritising security and privacy in all stages of the app’s design. They are planning to publish their security designs and the source code of the app to demonstrate this. Furthermore, they have confirmed that all data gathered by the app will only be used for NHS care, management, evaluation and research, and that individuals will be able to delete the app and their data at any point.

Constraints

There are two key conditions that must be met for contact tracing apps to be effective:

  • 80 per cent or more of the UK population who own a smartphone need to download it; and
  • the UK needs to test more than 100,000 people a day.

This is because contact tracing relies on large numbers of citizens being involved in the effort.

Encouraged technology

The UK Information Commissioner, Elizabeth Denham, has been supportive of the development of contact tracing apps. On 17 April she stated that “data protection laws [should] not get in the way of innovative use of data in a public health emergency – as long as the principles of the law (transparency, fairness and proportionality) are applied. The same approach applies to the use of contact tracing applications.”

Even though such apps are encouraged, organisations developing and using them need to be conscious of the privacy issues.

If you have any questions on technology law, data protection law or on any of the issues raised in this article please get in touch with one of our data protection and technology lawyers.