Schrems II

Schrems II - EDPB publishes FAQs on judgment

Following the Schrems II judgment (Data Protection Commissioner v Facebook Ireland and Maximillian Schrems), the European Data Protection Board (EDPB) has adopted a set of frequently asked questions and responses (FAQs) concerning the judgment. For more information about that decision, read our blog.

The Schrems II judgment

The European Court of Justice (ECJ) has invalidated the EU Commission’s decision approving the EU-U.S. Privacy Shield because U.S. intelligence agencies can access personal data relating to EU residents in ways that are incompatible with EU personal data protection laws and EU residents lack proper enforcement rights.

In addition, the ECJ ruled that the controller-processor Standard Contractual Clauses (SCCs), another widely used mechanism for international data transfers, remain valid. However, data exporters and importers must assess, prior to any transfer, the laws of the third country to which data is transferred to determine if those laws ensure an adequate level of protection of personal data.

Moving forward

The judgment was welcomed by the EDPB because it highlights the fundamental right to privacy in the context of the transfer of personal data to third countries. In response to the ECJ's ruling that the adequacy decision for the EU-US Privacy Shield is invalid, the EDPB invited the EU and US to work together and establish a complete and effective framework that guarantees the level of protection granted to personal data in the US is essentially equivalent to that guaranteed within the EU.

Schrems II: EDPB FAQs

Although the ECJ also determined in Schrems II that controller to processor standard contractual clauses (SCCs) remain valid as an adequate safeguard for data transfers, the EDPB commented that:

  • No grace period - the ECJ ruling applies with immediate effect. There will be no grace period during which organisations can remedy their Privacy Shield-based data transfers. In contrast, when the US-EU Safe Harbor framework was invalidated in 2015, the Article 29 Working Party granted a grace period until an appropriate solution was found with the U.S. authorities. It did so via a statement dated 16 October 2015, stating no enforcement action would be taken until the end of January 2016. However, while there will be no EU-wide grace period, national supervisory authorities will still have discretion over when to take enforcement actions in their territory.
  • The exporter and importer of the data being transferred must look beyond the protection provided by the terms of the SCCs and assess whether the country to which the data is being transferred offers adequate protection, in the context of the non-exhaustive elements set out in Article 45(2) of the GDPR. If it is determined that the country of destination does not provide an essentially equivalent level of protection to the GDPR, the exporter may have to consider adopting further protective measures in addition to using SCCs. The EDPB is considering what those additional measures could include and will report in due course.
  • The judgment highlights the importance of complying with the obligations included in the terms of the SCCs. If those contractual obligations are not or cannot be complied with, the exporter is bound by the SCCs to suspend the transfer or terminate the SCCs, or to notify its competent supervisory authority if it intends to continue transferring data.
  • Supervisory authorities (SAs) have a responsibility to suspend or prohibit a transfer of data to a third country pursuant to SCCs if those clauses are not or cannot be complied with in that third country, and the protection of the data transferred cannot be ensured by other means.
  • Implication for other transfer mechanisms including BCRs. The threshold set by the ECJ applies to all appropriate transfer mechanisms under Article 46 GDPR. U.S. law referred to by the ECJ (i.e., the Foreign Intelligence Surveillance Act and the Executive Order 12333) applies to any transfer to the U.S. via electronic means, regardless of the transfer mechanism used for such transfer. In particular, the ECJ’s judgment applies in the context of binding corporate rules (BCRs), since U.S. law will also prevail over this cross-border data transfer mechanism. Similar to the SCCs, transfers taking place based on BCRs should be assessed and appropriate supplementary measures should be taken. The EDPB states that it will further assess the consequences of the judgment on transfer mechanisms other than SCCs and BCRs (e.g., approved codes of conduct or certification mechanisms).
  • Companies can rely on the derogations set forth under Article 49 of the GDPR, provided that the conditions as interpreted by the EDPB in its guidance on Article 49 of the GDPR are met. When transferring personal data based on individuals’ consent, such consent should be explicit, specific to the particular data transfer(s) and informed, particularly regarding the risks of the transfer(s). In addition, transfers of personal data that are necessary for the performance of a contract should only take place occasionally. Further, in relation to transfers necessary for important reasons of public interest, the EDPB emphasises the need for an important public interest, as opposed to only focusing on the nature of the transferring organization. According to the EDPB, transfers based on the public interest derogation cannot become the rule and must be limited to specific situations and to a strict necessity test.

Schrems II: Further clarification expected

The EDPB is still assessing the judgment and will provide further clarification for stakeholders and guidance on transfers of personal data to third countries pursuant to the Schrems II judgment. Data exporters and importers should closely monitor upcoming developments and guidance of the EDPB and national supervisory authorities, assess their existing cross-border transfers and consider implementing supplementary legal, technical or organisational measures in order to ensure they can continue to transfer personal data to third countries lawfully. Whilst the judgment most obviously applies to data transfers with the US, it also has wider implications for transfers to any country outside the EU (third countries).

If you have any questions on Schrems II or data protection law more generally please get in touch with one of our data protection lawyers.


EU-US Privacy Shield

EU-US Privacy Shield invalid: Schrems II

In Data Protection Commissioner v Facebook Ireland and Maximillian Schrems (Case C-311/18) EU:C:2020:559, the European Court of Justice (ECJ) has given its preliminary ruling that Commission Decision 2010/87 on controller to processor standard contractual clauses (SCC) is valid but that Decision 2016/1250 on the adequacy of the protection provided by the EU-US Privacy Shield is invalid.

Background

The General Data Protection Regulation ((EU) 2016/679) (GDPR) prohibits the transfer of personal data outside of the EU to a third country unless certain conditions are met. In principle, it may take place in any of the following circumstances:

  • On the basis of a European Commission adequacy decision (Article 45, GDPR).
  • Where there are appropriate safeguards in place, such as standard contractual clauses (SCCs) or Binding Corporate Rules (BCRs), and on the condition that data subjects have enforceable rights and effective legal remedies (Articles 46 and 47, GDPR).
  • A derogation for a specific situation applies, such as the data subject has given their explicit consent (Article 49, GDPR).

EU-US Privacy Shield

The EU-US Privacy Shield is a framework constructed by the US Department of Commerce and the European Commission to enable transatlantic exchanges of personal data for commercial purposes.

The EU-US Privacy Shield enables companies from the EU and the US to comply with data protection requirements when transferring personal data from the EU to the US. Approved by the European Commission on 12 July 2016, the EU-US Privacy Shield replaced the Safe Harbor Principles, which the ECJ declared invalid for failing to ensure an adequate level of protection within the meaning of Article 25 of the Data Protection Directive in its October 2015 decision in Maximillian Schrems v Data Protection Commissioner (Case C-362/14) [2015] EUECJ.

Schrems II Facts

In October 2015, Mr Maximillian Schrems, an Austrian lawyer and data privacy campaigner, successfully challenged the validity of the EU-US Safe Harbor arrangement as a legal basis for transferring personal data from Facebook Ireland to servers belonging to Facebook Inc located in the US (commonly referred to as the Schrems I judgment).

Subsequently, in July 2016, the European Commission adopted a replacement adequacy Decision 2016/1250 approving a new framework for EU-US personal data flows, the EU-US Privacy Shield.

Mr Schrems reformulated his complaint to the Irish Data Protection Commissioner, claiming that the US does not offer sufficient protection for personal data transferred to that country and sought the suspension or prohibition of future transfers of his personal data from the EU to the US, which Facebook Ireland now carries out in reliance on Decision 2010/87 on controller to processor SCCs.

One of Mr Schrems' key concerns was that the US government might access and use EU individuals' personal data contrary to rights guaranteed by the Charter of Fundamental Rights of the EU (Charter) and that EU individuals would have no remedy available to them once their personal data is transferred to the US. Under US law, internet service providers such as Facebook Inc can be required to provide information to various agencies such as the National Security Agency, the Central Intelligence Services and the Federal Bureau of Investigation and it can be further used in various surveillance initiatives such as PRISM and UPSTREAM.

Decision on controller to processor SCCs

SCCs remain valid, but businesses using controller to processor SCCs (or planning to do so) now face additional burdens: they will need to conduct a Transfer Impact Assessment on whether, in the overall context of the transfer, there are appropriate safeguards in the third country for the personal data transferred out of the EU (practically speaking, the European Economic Area). EU data exporters will need to take into account not only the destination of the personal data but also, in particular, any access by public authorities and the availability of judicial redress for individuals, to ascertain whether SCCs are an appropriate mechanism, and may need to put in place additional safeguards.

Decision on EU-US Privacy Shield

The limitations on the protection of personal data, transferred from the EU to the US, arising from US domestic law "on the access to and use by US public authorities, are not circumscribed in a way that satisfies requirements that are essentially equivalent to those required under EU law, by the principle of proportionality, in so far as the surveillance programmes based on those provisions are not limited to what is strictly necessary".

As regards the requirement of judicial protection, the ECJ held that the Privacy Shield Ombudsperson does not provide individuals with any cause of action before a body which offers guarantees substantially equivalent to those required by EU law, so as to ensure the independence of the Ombudsperson and the existence of rules empowering the Ombudsperson to adopt decisions that are binding on US intelligence services.

EU-US Privacy Shield - Practical points

  • The EU-U.S. Privacy Shield is no longer valid and businesses solely relying on it to transfer personal data to the U.S. should rely on another transfer solution, including by putting SCCs in place.
  • While SCCs remain valid, the underlying transfer must be assessed on a case-by-case basis to determine whether the personal data will be adequately protected (e.g. because of potential access by law enforcement or national security agencies). This is, in effect, a Transfer Impact Assessment. This will be burdensome for small organisations but also large ones making hundreds, if not thousands, of transfers.
  • The EU Commission is now likely to issue updated SCCs. Those new clauses could bake in the Transfer Impact Assessment discussed above. While existing SCCs will hopefully be “grandfathered”, businesses should anticipate changes to their processes for new transfers.
  • The judgment could have a negative impact on any adequacy finding for the UK after the Brexit transition period. While there are material differences between the U.S. and UK surveillance regimes, the judgment will no doubt make the EU Commission more cautious in future adequacy assessments.
  • In the absence of an adequacy finding, transfers of personal data from the EU to the UK will be more difficult post-Brexit as EU businesses will necessarily have to consider the effect of UK government surveillance powers, in particular the Investigatory Powers Act 2016.
  • While the data protection authorities cannot grant a “grace period” as such, they may well take a gradual approach to enforcing these new requirements. As an illustration, when the Safe Harbor was struck down in 2015, data protection authorities indicated they would not take active enforcement for a few months to allow controllers to make new arrangements.

More to come…

With the publication of updated Standard Contractual Clauses expected and the UK adequacy decision pending, businesses handling cross-border data transfers to and from the EU or the US need to keep themselves informed of the latest developments. As it stands, SCCs will need to be part of any such cross-border transfer and a ‘Transfer Impact Assessment’ will be a new and additional obligation.

If you have any questions on data protection law or on any of the issues raised in this article please get in touch with one of our data protection lawyers.


GDPR Report

GDPR Report: EU Commission’s First Evaluation of the GDPR

On 24 June 2020, just over two years after its entry into application, the European Commission published an evaluation report on the General Data Protection Regulation (the Regulation / GDPR). The GDPR report shows the Regulation has met most of its objectives, in particular by offering citizens a strong set of enforceable rights and by creating a new European system of governance and enforcement.

Scope of the GDPR report

The GDPR proved flexible enough to support digital solutions in unforeseen circumstances such as the Covid-19 crisis. The GDPR report also concludes that harmonisation across the Member States is increasing, although a certain level of fragmentation remains and must be continually monitored. It also finds that businesses are developing a compliance culture and increasingly use strong data protection as a competitive advantage. The GDPR report contains a list of actions to further facilitate the application of the Regulation for all stakeholders, especially small and medium-sized companies, and to promote and further develop a truly European data protection culture and vigorous enforcement.

Background to the GDPR report

The General Data Protection Regulation is a single set of rules of EU law on the protection of individuals with regard to the processing of personal data and on the free movement of such data. It strengthens data protection safeguards, provides additional and stronger rights to individuals, increases transparency, and makes all those that handle personal data more accountable and responsible. It has equipped national data protection authorities with stronger and harmonised enforcement powers and has established a new governance system among the data protection authorities. It also creates a level playing field for all companies operating in the EU market, regardless of where they are established, ensures the free flow of data within the EU, facilitates safe international data transfers and has become a reference point at global level.

As stipulated in Article 97(2) of the GDPR, the report covers in particular international transfers and ‘cooperation and consistency mechanism', although the Commission has taken a broader approach in its review, in order to address issues raised by various actors during the last two years. These include contributions from the Council, the European Parliament, the EDPB, national data protection authorities and stakeholders. Key findings of the GDPR review are:

Empowering individuals to control their data

The GDPR enhances transparency and gives individuals enforceable rights, such as the right of access, rectification, erasure, the right to object and the right to data portability. Today, 69% of the population above the age of 16 in the EU have heard about the GDPR and 71% of people have heard about their national data protection authority, according to results published last week in a survey from the EU Fundamental Rights Agency. However, more can be done to help citizens exercise their rights, notably the right to data portability.

The application of the GDPR to new technologies

The GDPR report found that the Regulation has empowered individuals to play a more active role in relation to what is happening with their data in the digital transition. It is also contributing to fostering trustworthy innovation, notably through a risk-based approach and principles such as data protection by design and by default.

Enforcement of the GDPR

From warnings and reprimands to administrative fines, the GDPR provides national data protection authorities with the right tools to enforce the rules. However, they need to be adequately supported with the necessary human, technical and financial resources. Many Member States are doing this, with notable increases in budgetary and staff allocations. The GDPR report found that overall, there has been a 42% increase in staff and 49% in budget for all national data protection authorities taken together in the EU between 2016 and 2019. However, there are still stark differences between Member States.

Harmonised rules but still a degree of fragmentation and diverging approaches

The GDPR established an innovative governance system which is designed to ensure a consistent and effective application of the GDPR through the so-called ‘one-stop-shop’, which provides that a company processing data cross-border has only one data protection authority as interlocutor, namely the authority of the Member State where its main establishment is located. Between 25 May 2018 and 31 December 2019, 141 draft decisions were submitted through the ‘one-stop-shop’, 79 of which resulted in final decisions. However, the GDPR report concludes that more can be done to develop a truly common data protection culture. In particular, the handling of cross-border cases calls for a more efficient and harmonised approach and an effective use of all the tools provided in the GDPR for the data protection authorities to cooperate.

Advice and guidelines by data protection authorities

The EDPB is issuing guidelines covering key aspects of the Regulation and emerging topics. Several data protection authorities have created new tools, including helplines for individuals and businesses, and toolkits for small and micro-enterprises. It is essential to ensure that guidance provided at national level is fully consistent with guidelines adopted by the EDPB.

Developing a modern international data transfer toolbox

The GDPR report found that over the past two years, the Commission's international engagement on free and safe data transfers has yielded important results. This includes Japan, with which the EU now shares the world's largest area of free and safe data flows. The Commission will continue its work on adequacy with its partners around the world. In addition, and in cooperation with the EDPB, the Commission is looking at modernising other mechanisms for data transfers, including Standard Contractual Clauses, the most widely used data transfer tool. The EDPB is working on specific guidance on the use of certification and codes of conduct for transferring data outside of the EU, which needs to be finalised as soon as possible. Given that the European Court of Justice may provide clarifications in a judgment to be delivered on 16 July that could be relevant for certain elements of the adequacy standard, the Commission will report separately on the existing adequacy decisions after the Court of Justice has handed down its judgment.

Promoting convergence and international cooperation in the area of data protection

Over the last two years, the Commission has stepped up bilateral, regional and multilateral dialogue, fostering a global culture of respect for privacy and convergence between different privacy systems to the benefit of citizens and businesses alike. The Commission is committed to continuing this work as part of its broader external action, for example, in the context of the Africa-EU Partnership and in its support for international initiatives, such as ‘Data Free Flow with Trust'. At a time when violations of privacy rules may affect large numbers of individuals simultaneously in several parts of the world, it is time to step up international cooperation between data protection enforcers. This is why the Commission will seek authorisation from the Council to open negotiations for the conclusion of mutual assistance and enforcement cooperation agreements with relevant third countries.

Challenges for small and medium sized enterprises (SMEs)

The GDPR report noted that the Regulation, together with the Free Flow of Non-Personal Data Regulation, offers opportunities to companies by fostering competition and innovation, ensuring the free flow of data within the EU and creating a level playing field with companies established outside the EU. The right to portability, coupled with an increasing number of individuals in search of more privacy-friendly solutions, has the potential to lower the barriers to entry for businesses and open up possibilities for growth based on trust and innovation. However, some stakeholders report that the application of the GDPR is challenging, especially for small and medium sized enterprises.

SMEs stress in particular the importance and usefulness of codes of conduct which are tailored to their situation and which do not entail disproportionate costs. As regards certification schemes, security (including cybersecurity) and data protection by design are key elements to be considered under the GDPR and would benefit from a common and ambitious approach throughout the EU. The Commission is currently working on standard contractual clauses between controllers and processors, building on the on-going work on the modernisation of the standard contractual clauses for international transfers.

At EM Law we specialise in helping small and medium sized companies comply with the GDPR. If you have any questions on data protection law or on any of the issues raised in this article please get in touch with one of our data protection lawyers.


Digital Marketing

Digital Marketing - Legal Issues

Digital marketing is a growth industry, with legislation struggling to keep up. Unsurprisingly, there are legal issues that digital marketing businesses need to be aware of to remain compliant. The House of Lords' 2018 report "UK advertising in a digital age" noted that digital marketing accounted for over half of all spending on advertising in the UK for the first time in 2017. This figure is only likely to increase, especially in the aftermath of COVID-19. This article provides some background on the types of digital marketing and some of the legal issues to consider in this context.

Digital marketing formats

The Digital Adspend study, produced by industry body the Internet Advertising Bureau (IAB) and accountants PricewaterhouseCoopers, breaks down 2017 digital marketing spend by format, as follows:

Paid-for search: £5.82bn, of which smartphone spend was £2.62bn. This is essentially sponsored search results, where advertisers pay to have their details presented at the top of a search results page or prominently featured elsewhere on the page.

Display: £4.18bn, within which falls:

  • Online video: £1.61bn, of which smartphone spend was £1.17bn. An example is the pre-roll advert which appears before you watch a YouTube clip, or videos which start playing as the page loads or when your mouse scrolls over them.
  • Banners and standard display formats: £1.31bn, of which smartphone spend was £418m. These are the obvious adverts and include those which appear across the top of the screen (banner adverts) or in a sidebar, overlay adverts (which pop up on-screen and have to be clicked to close) and interstitial adverts (full-screen adverts that pop up between expected content, for example before a target page appears on the screen).

Native: £1.03bn, of which smartphone spend was £895m. An advertorial is native advertising, as are adverts which appear to be recommendations by the publisher ("you might also like"), influencer marketing on social media and adverts which appear to be search results.

Classified and other: £1.47bn. Classified advertising is advertising in an online directory or marketplace (for example, Rightmove, Auto Trader and Gumtree).

Commentators note that the biggest increase recently has been in spend on advertising targeting mobile phone users, in particular using a video format.

Key industry players

The CMA's final report on its market study into online platforms and digital advertising estimates that search advertising revenues totalled around £7.3 billion in 2019, of which more than 90% was earned by Google. Total spend on display advertising was worth £5.5 billion, of which it is estimated more than half went to Facebook.

Google receives revenue from its search engine and other brands such as YouTube, Google Maps and Google Play (an app and digital media store). Google sells advertising space on its own and other sites through Google Ads, and provides services to buy and optimise campaigns on Google via its Google Marketing Platform.

Digital Marketing Legal Issues

Adverts must be obviously identifiable as such.

All advertising must be obviously identifiable as advertising. This is a requirement under:

The Consumer Protection from Unfair Trading Regulations 2008 (SI 2008/1277) (CPUT) which implement the Unfair Commercial Practices Directive (2005/29/EC) (UCPD):

  • A failure to identify commercial intent, unless this is already apparent from the context, is a misleading omission.
  • Using editorial content in the media to promote a product where a trader has paid for the promotion without making that clear in the content or by images or sounds clearly identifiable by the consumer (advertorial) is a prohibited commercial practice.
  • Falsely claiming or creating the impression that the trader is not acting for purposes relating to his trade, business, craft or profession, or falsely representing oneself as a consumer is a prohibited commercial practice.

The Electronic Commerce (EC Directive) Regulations 2002 (SI 2002/2013) (E-Commerce Regulations) which implement the E-Commerce Directive (2000/31/EC):

  • Service providers must ensure that any commercial communication provided by them which constitutes or forms part of an information society service (which would include all advertising) is clearly identifiable as a commercial communication.

The UK Code of Non-broadcast Advertising and Direct & Promotional Marketing (CAP Code):

  • Marketing communications must be obviously identifiable as such.
  • Marketing communications must not falsely claim or imply that the marketer is acting as a consumer, or for purposes outside its trade, business, craft or profession; marketing communications must make clear their commercial intent, if that is not obvious from the context.
  • Marketers and publishers must make clear that advertorials are marketing communications; for example, by heading them "advertisement feature".

Information obligations on digital advertisers

Online advertisers need to:

  • Provide certain information about themselves on their websites.
  • Include certain information about themselves and their products in their online adverts.

These obligations, which apply to "information society service" providers, derive from the E-Commerce Regulations which implement the E-Commerce Directive (2000/31/EC) (E-Commerce Directive).

Information advertisers must include on websites

The information the advertiser must include on websites consists of:

  • Its name.
  • The geographic address at which it is established.
  • Details, including an email address, which make it possible to contact the advertiser rapidly and communicate with it in a direct and effective manner.
  • Where the advertiser is registered in a trade (or similar) register available to the public, details of the register in which the service provider is entered and its registration number, or equivalent means of identification in that register.
  • Where the provision of the service is subject to an authorisation scheme, the particulars of the relevant supervisory authority. Advertising itself is not subject to an authorisation scheme in the UK, but the advertiser's business may be.
  • The advertiser's VAT number.
  • Where the advertiser exercises a regulated profession:
      • the details of any professional body or similar institution with which the advertiser is registered;
      • the advertiser’s professional title and the EEA state where that title has been granted; and
      • a reference to the professional rules applicable to the service provider in the member state of establishment, and the means to access them.

Information requirements for online adverts

An information society service provider (which includes any online advertiser) must ensure that any commercial communication provided by it as part of an information society service (which would include all digital marketing) shall:

  • Be clearly identifiable as a commercial communication.
  • Clearly identify the person on whose behalf the commercial communication is made.
  • Clearly identify as such any promotional offer (including any discount, premium or gift) and ensure that any conditions which must be met to qualify for it are easily accessible and presented clearly and unambiguously.
  • Clearly identify as such any promotional competition or game and ensure that any conditions for participation are easily accessible and presented clearly and unambiguously.

Digital Marketing: Controls on the use of personal data and online behavioural advertising (OBA)

The digital environment offers advertisers the opportunity to track users' online behaviour to build a profile of their interests and target advertising at them. This practice is known as "online behavioural advertising" (OBA) or sometimes as interest-based advertising (IBA).

Information is generally collected using online identifiers (such as cookies, internet protocol (IP) addresses, radio frequency identification (RFID) tags, advertising IDs, pixel tags, account handles and device fingerprints) which can be used variously to note information such as searches conducted, content viewed, purchases made and the user's location. Data about browsing habits can be combined with information about the user obtained via registrations and purchases.
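To make the mechanics above concrete, here is a deliberately simplified sketch (all type and function names are hypothetical, not taken from any real ad-tech library) of how events keyed to a single online identifier, such as an advertising ID, can be aggregated into an interest profile:

```typescript
// Hypothetical sketch of profile building in OBA; not a real ad-tech API.
type TrackingEvent = {
  adId: string;                         // online identifier, e.g. an advertising ID
  kind: "search" | "view" | "purchase"; // what the user did
  topic: string;                        // inferred interest category
};

// Combine events into a per-identifier profile: topic -> occurrence count.
function buildProfiles(events: TrackingEvent[]): Map<string, Map<string, number>> {
  const profiles = new Map<string, Map<string, number>>();
  for (const e of events) {
    const profile = profiles.get(e.adId) ?? new Map<string, number>();
    profile.set(e.topic, (profile.get(e.topic) ?? 0) + 1);
    profiles.set(e.adId, profile);
  }
  return profiles;
}
```

Each identifier ends up with a weighted list of interests, which is what allows adverts to be targeted; under the GDPR, an identifier like this is itself likely to constitute personal data.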

OBA may be conducted by a website owner solely based on activity on its own site (first-party OBA) or by a third party tracking activity across multiple websites and user devices and serving adverts for products not necessarily sold on the website being viewed (third-party OBA).

Examples of OBA include:

  • Advertising (such as pop-ups and banners) for products a user is likely to be interested in based on their interests, as revealed by their browsing habits or searches.
  • Retargeting of adverts for products a user has viewed, encouraging them to go back and make or complete a purchase.
  • Advertising to a mobile phone promoting a cafe which a user is passing near to.

Advertisers need to be aware that if they have collected personal data at any stage of the process that enables them to target advertising at individuals, they will be classified as a data controller (unless they are acting on behalf of another data controller, in which case they may be a data processor). A data controller must notify the individuals whose personal data it is using about who it is, what personal data it is collecting and what it is using that data for. It must also only process that data under one of the specified lawful bases. So if, for example, an advertiser is processing personal data relating to an individual’s political or religious beliefs, the advertiser will need to obtain the individual’s consent to such processing.

Cookies

The Privacy and Electronic Communications (EC Directive) Regulations 2003 (SI 2003/2426) require the user's consent to the use of non-essential cookies and similar technologies on their devices, including computers or mobiles, but also other equipment such as wearable technology, smart TVs, and connected devices including the ‘Internet of Things’.

If the advertiser’s cookies are collecting personal data then the advertiser will also need to comply with data protection laws as a data controller.
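A minimal sketch of how a site might gate cookie-setting behind consent is below. The cookie names and the consent flag are illustrative assumptions, not a real framework's API; the point is simply that strictly necessary cookies need no consent while non-essential ones do:

```python
# Illustrative only: gate non-essential cookies behind recorded user
# consent, as the 2003 Regulations (PECR) require.
ESSENTIAL = {"session_id", "csrf_token"}        # strictly necessary: no consent needed
NON_ESSENTIAL = {"analytics_id", "ad_tracker"}  # consent required before setting

def cookies_to_set(user_consented: bool) -> set:
    """Return the cookies a site may lawfully set for this user."""
    allowed = set(ESSENTIAL)
    if user_consented:
        allowed |= NON_ESSENTIAL
    return allowed

assert "ad_tracker" not in cookies_to_set(user_consented=False)
assert "ad_tracker" in cookies_to_set(user_consented=True)
```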

A short introduction

Digital marketing can give rise to many legal issues and what has been mentioned here is only a short overview. The content of adverts and websites and the use of personal data need to be considered from the outset.

EM law are experts in media, technology and data protection law. Please contact us if you need any help with digital marketing legal issues.


Big Data

Big Data – AI and Machine Learning

The use of computers and the internet has allowed unprecedented amounts of data to be collected and used for a variety of ends. Big data technology represents the most advanced and sizeable use of this new asset. The size and extent of such operations come up against a number of regulatory barriers, most notably the General Data Protection Regulation (EU) 2016/679 (GDPR).

What is Big Data?

Big data is the harnessing, processing and analysis of digital data in huge and ever-increasing volume, variety and velocity. It has quickly risen up the corporate agenda as organisations appreciate that they can gain advantage through valuable insights about their customers and users through the techniques that are rapidly developing in the data world.

Much big data (for example, climate and weather data) is not personal data. Personal data relates to an identifiable living individual. For data that is or could be personal data, data protection legislation, in particular the GDPR, must be carefully considered.

Brexit

During the transition period (which ends on 31 December 2020 unless extended) and afterwards, organisations should, as the ICO has noted, continue data protection compliance as usual. The key principles, rights and obligations will remain the same, and organisations already complying with the GDPR should be in a good position to comply with the post-Brexit data protection regime.

Big Data Analytics, Artificial Intelligence and Machine Learning

Being able to use big data is critical to the development of Artificial Intelligence (AI) and machine learning. AI is the ability of a computer to perform tasks commonly associated with human beings. In particular, AI can cope with, and to a large extent is predicated on, the analysis of huge amounts of data in its varying shapes, sizes and forms.

Machine learning is a set of techniques that allows computers to ‘think’ by creating mathematical algorithms based on accumulated data.
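A toy example of that idea is below: a single parameter is fitted to accumulated data by repeatedly nudging it to reduce its error. This is a deliberately minimal sketch of learning from data, not a production technique:

```python
# Toy "machine learning": fit a slope w so that w * x approximates y,
# nudging w after each example to reduce the squared error.
data = [(1, 2.0), (2, 4.0), (3, 6.0)]  # underlying rule: y = 2x

w = 0.0
learning_rate = 0.05
for _ in range(200):                    # repeated passes over the data
    for x, y in data:
        error = w * x - y
        w -= learning_rate * error * x  # gradient of squared error w.r.t. w

print(round(w, 2))  # converges to 2.0, the rule hidden in the data
```

The algorithm was never told the rule y = 2x; it inferred it from the accumulated examples, which is the essence of the techniques big data analytics relies on.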

Big data, AI and machine learning are linked as described by the ICO:

“In summary, big data can be thought of as an asset that is difficult to exploit. AI can be seen as a key to unlocking the value of big data; and machine learning is one of the technical mechanisms that underpins and facilitates AI. The combination of all three concepts can be called ‘big data analytics’.” (Paragraph 11 of ICO: Big data and data protection 2017.)

Big data analytics differs from traditional data processing in the following ways:

  • It uses complex algorithms for processing data. This usually involves a “discovery” phase to find relevant correlations (which can be a form of machine learning) so that algorithms can be created.
  • There is limited transparency on how these algorithms work and how data is processed. As vast amounts of data are processed through massive networks, a “black box” effect is created that makes it very difficult to understand the reasons for decisions made by the algorithms.
  • There is a tendency to collect “all the data” as it is more easily available rather than limiting the analytics to random samples or statistically representative samples.
  • Often data is re-used for a purpose different from that for which it was originally collected, often because it is obtained from third parties.
  • It usually involves data from new sources such as the Internet of Things (IoT) and “observed” data that has been generated automatically, for example by tracking online behaviour rather than data provided by individuals. In addition, new “derived” or “inferred” data produced by the algorithms is used further in the analytics.

Big Data and Data protection

Managing compliance with the GDPR will play a large part in big data management projects involving data harvested from the expanding range of available digital sources. Many organisations will already have an established data protection governance structure and policy and compliance framework in place and these can be helpful as pathfinders towards structured data governance.

Controller or processor?

Under Article 4(7) of the GDPR, a person who determines “the purposes and means” of processing personal data is a controller and under Article 4(8), a processor just processes personal data on behalf of the controller.

Correctly assessing whether an organisation is a controller or a processor in the context of the collection of massive amounts of data is therefore critical to the GDPR compliant structuring of the relationship and to allocating risk and responsibility.

However, the borderline between controller and processor can be fuzzy in practice. Where it lies in the AI context was considered for the first time in the UK in the ICO’s July 2017 decision on an agreement between the Royal Free Hospital and Google DeepMind. Under the agreement, DeepMind used the UK’s standard, publicly available acute kidney injury (AKI) algorithm to process personal data of 1.6m patients in order to test the clinical safety of Streams, an AKI application that the hospital was developing. The ICO ruled that the hospital had failed to comply with data protection law and, as part of the remediation required by the ICO, the hospital commissioned law firm Linklaters to audit the system. The hospital published the audit report in May 2018, which found (at paragraph 20.7) that the agreement had properly characterised DeepMind as a processor not a controller.

Important to this characterisation were the facts that the algorithm was simple and that its use had been mandated by the NHS. Determining whether an organisation is a processor or controller is a complex issue, and seeking advice on the matter may be crucial to understanding the potential liabilities of those using big data.

Personal data

In the context of big data, it is worth considering whether personal data can be fully anonymised, which would take it outside data protection requirements altogether. This is noted in Recital 26 of the GDPR, which says that:

"the principles of data protection should therefore not apply to anonymous information, namely information which does not relate to an identified or identifiable natural person or to personal data rendered anonymous in such a manner that the data subject is not or no longer identifiable".

However, personal data which has been pseudonymised (in other words, data that could still identify an individual when combined with additional information) is still classed as personal data.
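The distinction can be illustrated with a short sketch (the keyed hash and lookup table are illustrative assumptions, not a prescribed method). Replacing a name with a token is only pseudonymisation for as long as a key or lookup table exists somewhere that permits re-identification:

```python
import hashlib

# Illustrative only: a keyed hash replaces the name in the dataset,
# but whoever holds the key or lookup table can re-identify the person,
# so the data remains personal data under the GDPR.
secret_key = "re-identification-key"   # hypothetical; held separately

def pseudonymise(name: str) -> str:
    return hashlib.sha256((secret_key + name).encode()).hexdigest()[:12]

lookup = {}                            # the "additional information" Recital 26 refers to
token = pseudonymise("Jane Smith")
lookup[token] = "Jane Smith"

# The dataset now holds only tokens, but re-identification is possible:
assert lookup[token] == "Jane Smith"
```

Only if the key and lookup table were irreversibly destroyed (and no other means of identification remained) could the data begin to approach true anonymisation.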

Profiling

The GDPR includes a definition of profiling that is relevant to the processing of big data. Profiling is defined as any form of automated processing of personal data used to evaluate certain personal aspects relating to a natural person, in particular to analyse or predict the following: performance at work; economic situation; health; personal preferences; interests; reliability; behaviour; location; movements. (Article 4(4), GDPR.)

The GDPR includes data subject rights in relation to automated decision making, including profiling. The fact that profiling is taking place must be disclosed to the individual, together with information about the logic involved, as well as the significance and the envisaged consequences for such processing.

Individuals have the right not to be subject to a decision based solely on automated processing (which includes profiling), which produces legal effects concerning them or similarly significantly affects them (Article 22(1), GDPR). However, this right will not apply in certain cases, for example if the individual has given explicit consent, although suitable measures must be implemented to protect the data subjects.

Fair processing

In the ICO Big Data Paper 2017, the ICO emphasises the importance of fairness, transparency and meeting the data subject’s reasonable expectations in data processing. It states that transparency about how the data is used will be an important element when assessing compliance. It also highlights the need to consider the effect of the processing on the individuals concerned as well as communities and societal groups concerned. Similarly, the EDPS 2015 opinion stresses that organisations must be more transparent about how they process data, afford users a higher degree of control over how their data is used, design user friendly data protection into their products and services and become more accountable for what they do.

Transparency

As well as the general requirement for transparency in Article 5(1)(a), the GDPR includes specific obligations on controllers to provide data subjects with certain prescribed information (typically done in the form of a privacy notice) (Articles 13 and 14, GDPR).

The ICO Big Data Paper 2017 notes that the complexity and opacity of data analytics can lead to mistrust and potentially be a barrier to data sharing, particularly in the public sector. In the private sector, it can lead to reduced competitiveness from lack of consumer trust. Therefore privacy notices are a key tool in providing transparency in the data context. In relation to privacy notices, the Paper suggests using innovative approaches such as videos, cartoons, icons and just-in-time notifications, as well as a combination of approaches to make complex information easier to understand.

An introduction

This blog is no more than an introduction to, and summary of, some of the legal issues raised by big data. In many ways the GDPR was created in response to such activity, so the extent of its applicability to the topic is unsurprising. Any organisation looking to undertake such a project should be aware of the regulations in a way that allows compliance to be built into its operating model from the outset.

If you have any questions on data protection law or on any of the issues raised in this article please get in touch with one of our data protection lawyers.


Wm Morrison Supermarkets plc

Data Breach Claims – Wm Morrison Supermarkets plc

In Wm Morrison Supermarkets plc v Various Claimants [2020] UKSC 12, the Supreme Court has overturned judgments of the High Court and Court of Appeal and decided that a supermarket was not vicariously liable for unauthorised breaches of the Data Protection Act 1998 committed by an employee.

Wm Morrison Supermarkets plc v Various Claimants - the facts

In 2013, Mr Skelton, who was then employed by Wm Morrison Supermarkets plc (Morrisons) as an internal IT auditor, was provided with a verbal warning for minor misconduct. Subsequently, he developed an irrational grudge against his employer. After being asked by Morrisons to provide payroll data for the entire workforce to external auditors, Mr Skelton copied the data onto a USB stick. He took the USB stick home and posted the data on the internet, using another employee's details in an attempt to conceal his actions. He also sent this data to three national newspapers, purporting to be a concerned member of the public.

The newspapers did not publish the data, but one newspaper alerted Morrisons, who immediately took steps to remove the data from the internet, contact the police and begin an internal investigation. Morrisons spent £2.26 million dealing with the aftermath of the disclosure, a large proportion of which was spent on security measures for its employees. Mr Skelton was arrested and ultimately convicted of criminal offences under the Computer Misuse Act 1990 and section 55 of the DPA 1998, which was in force at the time.

The claimants in this case were 9,263 of Morrisons' employees or former employees. They claimed damages from Morrisons in the High Court for misuse of private information and breach of confidence, and for breach of its statutory duty under section 4(4) of the DPA 1998. The claimants alleged that Morrisons was either primarily liable under those heads of claim or vicariously liable for Mr Skelton's wrongful conduct.

Data Protection Act 1998

This case was decided under the Data Protection Act 1998 (DPA 1998) which was applicable at the time. The DPA 1998 implemented the Data Protection Directive (95/46/EEC) and imposed broad obligations on those who collect personal data (data controllers), as well as conferring broad rights on individuals about whom data is collected (data subjects). Section 4(4) of the DPA 1998 provided that a data controller must comply with eight data protection principles in relation to all personal data with respect to which they are a controller.

Under section 13(1), any breach of the DPA 1998 which caused damage entitled the victim to compensation for that damage. Section 13(2) provided as follows:

"An individual who suffers distress by reason of any contravention by a data controller of any of the requirements of this Act is entitled to compensation from the data controller for that distress if the individual also suffers damage by reason of the contravention."

Under section 13(3), it was a defence to any proceedings under section 13 for a person, or in this case Morrisons, to prove that they had taken such care as was reasonably required in all the circumstances to comply with the relevant requirement.

Vicarious liability

It was also crucial to consider whether Morrisons could be vicariously liable for their employee’s action in this instance. Employers will be liable for torts committed by an employee under the doctrine of vicarious liability where there is a sufficient connection between the employment and the wrongdoing. There is a two-stage test:

  • Is there a relationship between the primary wrongdoer and the person alleged to be liable which is capable of giving rise to vicarious liability?
  • Is the connection between the employment and the wrongful act or omission so close that it would be just and reasonable to impose liability?

In Lister v Hesley Hall Ltd [2001] UKHL 22, the House of Lords characterised the second stage as a "sufficient connection" test. The question was whether the torts were "so closely connected with [the] employment that it would be fair and just to hold the employers vicariously liable".

In Mohamud v Wm Morrison Supermarkets plc [2016] UKSC 11 (Mohamud), the Supreme Court held that the supermarket was vicariously liable for an employee's unprovoked violent assault on a customer. It found that there was a sufficiently close connection between the assault and the employee's job of attending to customers, such that the employer should be held vicariously liable.

Wm Morrison Supermarkets plc - Decision

The Supreme Court held that Morrisons was not vicariously liable for Mr Skelton's actions. It found that the Court of Appeal had misunderstood the principles governing vicarious liability in the following respects:

  • The disclosure of the data on the internet did not form part of Mr Skelton's functions or field of activities. This was not an act which he was authorised to do.
  • Although there was a close temporal link and an unbroken chain of causation linking the provision of the data to Mr Skelton for the purpose of transmitting it to the auditors and his disclosing it on the internet, a temporal or causal connection did not in itself satisfy the close connection test.
  • The reason why Mr Skelton acted wrongfully was not irrelevant. Whether he was acting on his employer's business or for purely personal reasons was highly material.

The mere fact that Mr Skelton's employment gave him the opportunity to commit the wrongful act was not sufficient to warrant the imposition of vicarious liability. It was clear that Mr Skelton was not engaged in furthering his employer's business when he committed the wrongdoing. On the contrary, he was pursuing a personal vendetta. His wrongful conduct was not so closely connected with acts which he was authorised to do that it could fairly and properly be regarded as done by him while acting in the ordinary course of his employment.

Comment

This decision will provide welcome confirmation for employers that they will not always be liable for data breaches committed by rogue employees. It similarly provides helpful clarification for practitioners on the way in which the judgment in Mohamud should be applied in future cases concerning vicarious liability.

The facts in this case were extreme. It seems that Morrisons were wholly unaware of the grudge held by Mr Skelton. Mr Skelton also took extraordinary actions to cover up what he had done and even to frame another employee.

Unanswered questions

Had Morrisons been found vicariously liable for Mr Skelton’s actions, the employees who made the claims would have had to prove that they suffered ‘distress, anxiety, upset and damage’ from the mishandling of their personal information. A Supreme Court ruling on the issue would have provided a helpful benchmark for those wanting to understand more about how our courts quantify compensation for data breaches.

Moving forward

Employers should take away from the judgment that, although this case was decided under the previous data protection regime, the DPA 1998 and the GDPR are based on broadly similar principles. The GDPR and the Data Protection Act 2018 (DPA 2018) will therefore not be a barrier to vicarious liability actions in data privacy proceedings commenced under the current regime.

Additionally, the GDPR makes compliance far more onerous for controllers and exposes them to huge revenue-based fines and data subject compensation claims for breaches of the GDPR and DPA 2018. This includes failing to safeguard data to statutory standards and neglecting to have governance in place to curb the malicious acts of rogue employees.

Morrisons' success in bringing to an end the threat of a group action for compensation in this case follows Google LLC being granted permission to appeal against the Court of Appeal's order in Lloyd v Google LLC [2019] EWCA Civ 1599, and is another significant development in the progress of representative class actions in the UK legal system.

If you have any questions on data protection law or on any of the issues raised in this article please get in touch with one of our data protection lawyers.


COVID-19 Contact Tracing Apps - Privacy Concerns

Contact Tracing Apps – Privacy Concerns

Contact tracing apps are being developed by governments and private enterprises to fight COVID-19. Their design and use however raise serious privacy concerns.

How do contact tracing apps work?

Contact tracing apps are mobile software applications designed to help identify individuals who may have been in contact with an infected person.

In the context of COVID-19 this means that anyone with the app who has been diagnosed with the virus, or has self-diagnosed, can enter that information into the app. Then, via the use of Bluetooth, anyone who has come, or comes, into contact with that diagnosed or self-diagnosed person will be notified by the app. If you are notified of such contact you can take steps to self-quarantine or otherwise manage your exposure. This all relies upon individuals carrying their mobile phones at all times with Bluetooth activated, which has cast doubt on the apps' potential effectiveness.
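The general design of such Bluetooth-based apps can be sketched as follows. This is a simplified illustration of the decentralised rotating-identifier approach described later in this article, not the actual protocol of any specific app; the class and method names are assumptions for the sketch:

```python
import secrets

# Simplified sketch: phones broadcast rotating random identifiers,
# remember identifiers they hear nearby, and later check them against
# identifiers published by users who report a diagnosis.
class Phone:
    def __init__(self):
        self.my_ids = [secrets.token_hex(8) for _ in range(14)]  # one per day
        self.heard = set()               # identifiers received over Bluetooth

    def broadcast(self, day: int) -> str:
        return self.my_ids[day]

    def receive(self, identifier: str) -> None:
        self.heard.add(identifier)

    def exposed(self, published_ids) -> bool:
        # Matching happens locally; no names or locations are exchanged.
        return bool(self.heard & set(published_ids))

alice, bob = Phone(), Phone()
bob.receive(alice.broadcast(day=3))      # Alice and Bob were near each other

# Alice is later diagnosed and uploads her identifiers; Bob's phone
# matches them against what it heard and flags the exposure.
assert bob.exposed(alice.my_ids)
```

Because only random tokens ever leave the device, a match tells the user that an exposure occurred without revealing who the infected person was.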

Why adopt contact tracing apps?

By tracing the contacts of infected individuals, testing them for infection, treating the infected and tracing their contacts in turn, public health authorities aim to reduce infections in the population. Diseases for which contact tracing is commonly performed include tuberculosis, vaccine-preventable infections like measles, sexually transmitted infections (including HIV), blood-borne infections, some serious bacterial infections, and novel infections (e.g. coronavirus).

Privacy issues with contact tracing apps

Numerous applications are in development, with official government support in some territories and jurisdictions. Several frameworks for building contact tracing apps have been developed. Privacy concerns have been raised, especially about systems that are based on tracking the geographical location of app users.

Less intrusive alternatives include the use of Bluetooth signals to log a user's proximity to other mobile phones. On 10 April 2020, Google and Apple jointly announced that they would integrate functionality to support such Bluetooth-based apps directly into their Android and iOS operating systems.

These Bluetooth signals offer greater privacy protection because they operate on an anonymous basis. Someone who comes into contact with an infected person will therefore learn only that they have come into contact with an infected person, rather than receiving any unnecessary information such as a unique identifying code or the name of the infected person.

ICO’s blog

The Information Commissioner (IC), Elizabeth Denham, has published a blog setting out data protection considerations for organisations using contact tracing and location data technologies in connection with the COVID-19 pandemic.

While the IC is maintaining a pragmatic and flexible approach to data protection compliance during the pandemic, the IC reminds organisations that the public must remain assured that their data will be processed lawfully in connection with the use of technology to track the spread of COVID-19 by individuals.

To help achieve the IC's twin goals of maintaining public trust and promoting compliance, the blog includes a series of questions for organisations to bear in mind when using new technologies to combat the pandemic. It focusses on compliance with data protection requirements under Article 25 of the General Data Protection Regulation ((EU) 2016/679) (GDPR), the data minimisation and storage limitation principles under Article 5(1) and data subject rights generally under the GDPR.

The IC asks organisations to consider the following questions:

  • Have you demonstrated how privacy is built into the process or technology?
  • Is the planned collection and use of personal data necessary and proportionate?
  • What control do users have over their data?
  • How much data needs to be gathered and processed centrally?
  • When in operation, what are the governance and accountability processes in your organisation for the ongoing monitoring and evaluation of data processing, that is, to ensure that it remains necessary and effective and that the safeguards in place are still suitable?
  • What happens when the processing is no longer necessary?

The IC extends an offer to assist organisations with these processes, by providing guidance and tools to consider data protection requirements in the planning and development phase for projects adopting new technology, and by performing an audit of the measures and processes implemented by an organisation when the project has become operational.

In practice

The Information Commissioner's Office (ICO) has published a discussion document setting out its expectations and recommended best practice for the development and deployment of COVID-19 contact tracing apps.

The document was published in advance of Information Commissioner Elizabeth Denham's and Executive Director of Technology and Innovation Simon McDougall's appearance before the Human Rights Joint Committee on 4 May 2020, and is intended to help NHSX and other developers of contact tracing apps comply with information provision and data protection by design and default requirements under the GDPR.

Key principles and recommendations for developers to consider include:

  • Performing a Data Protection Impact Assessment (DPIA) prior to implementation of the app and refreshing the DPIA whenever the app is updated during its life cycle.
  • Being transparent with users and providing them with clear information about the purpose and design choices for the app and the benefits the app seeks to deliver for both users and the NHS. Users must also be fully informed about the data to be processed by the app before the processing takes place.
  • Complying with data minimisation, retention and security principles under Articles 5(1) and 32 of the GDPR.
  • Ensuring participation is voluntary and users can opt in and out of participation and exercise their data subject rights (including rights of access, erasure, restriction and rectification) with ease. This could involve the developer providing users with a dedicated privacy control panel or dashboard.
  • Relying on valid user consent or an alternative lawful basis under Article 6(1) of the GDPR for the processing of personal data where this is necessary and more appropriate, such as performance of a task in the public interest (particularly where an app is developed by or on behalf of a public health authority).
  • The collection of personal data relating to health shall be allowed only where the processing is either based on explicit consent, is necessary for reasons of public interest in the area of public health, is for health care purposes, or is necessary for scientific research or statistical purposes.

The ICO will keep these recommendations under review and remains open to feedback.

What does this mean for businesses?

If contact tracing apps are designed in line with ICO guidance, businesses looking to monitor employees can have confidence in asking employees to use such apps. In all likelihood the NHSX app will be used in the UK and therefore businesses should be aware of how that app is being developed.

NHSX development

On 12 April 2020, Matt Hancock, the Secretary of State for Health and Social Care and the politician directly responsible for the NHS, announced that the NHS was developing a mobile app that will allow for contact tracing. The app is being developed by NHSX, a specialist unit responsible for digital transformation in the NHS.

In response to the Information Commissioner’s approach, NHSX has stated that they are prioritising security and privacy in all stages of the app’s design. They are planning to publish their security designs and the source code of the app to demonstrate this. Furthermore, they have confirmed that all data gathered by the app will only be used for NHS care, management, evaluation and research, and that individuals will be able to delete the app and their data at any point.

Constraints

There are two key constraints on the effectiveness of contact tracing apps:

  • 80 per cent or more of the UK population who own a smartphone need to download it; and
  • the UK needs to test more than 100,000 people a day.

This is because contact tracing relies on large numbers of citizens being involved in the effort.

Encouraged technology

The UK Information Commissioner, Elizabeth Denham, has been supportive of the development of contact tracing apps. On 17 April she stated that “data protection laws [should] not get in the way of innovative use of data in a public health emergency – as long as the principles of the law (transparency, fairness and proportionality) are applied. The same approach applies to the use of contact tracing applications.”

Even though contact tracing apps are encouraged, organisations developing and using them need to be conscious of the privacy issues.

If you have any questions on technology law, data protection law or on any of the issues raised in this article please get in touch with one of our data protection and technology lawyers.


European Representative GDPR

European Representative – GDPR After Brexit

What is a “European Representative” and do you need to appoint one? We have received lots of marketing from businesses in France, Germany and other members of the EU encouraging us to sign up to their European Representative Office service so that we can be compliant with GDPR. This article covers the role of the European Representative and addresses the question about whether you need to appoint one now or later.

Do organisations need to appoint a European Representative right now?

No.

Do organisations need to appoint a European Representative in the future?

Maybe.

If you are a UK business offering goods or services to individuals in the European Economic Area (EEA) then, after the Brexit transition period ends (31 December 2020), you may need to appoint a European Representative in the EEA because the UK will no longer be within the EEA. This representative would act as the point of contact for your data subjects within the EEA as required by Article 27 of the General Data Protection Regulation (GDPR).

See below for the specific circumstances in which this requirement exists.

Transition period

The UK left the EU on 31 January 2020. From then until 31 December 2020 the UK will be in a “transition period”. During the transition period EU law, including data protection law, will continue to apply in the UK, and no UK organisation will need to appoint a European Representative until after the transition period ends.

European Representatives after the Brexit transition period

Once the transition period ends UK-based data controllers or processors who:

  • are without any offices, branches or other establishments in the EEA

and

  • who are offering goods or services to individuals in the EEA or monitoring the behaviour of individuals located in the EEA

will be required to have a European Representative in the EEA.

Exceptions

There are exceptions to the above requirement where:

  • you are a public authority or body.
  • your data processing is only occasional, presents a low risk to data protection rights of individuals and does not involve the large-scale use of special category or criminal offence data.

Who can be your European Representative?

A European Representative may be an individual or a company or other organisation established in the EEA where a significant portion of the individuals whose personal data you are processing are located. So if a significant portion of your customers are in Greece, your representative should be located in Greece.

One representative can act on behalf of several non-EU controllers and processors. A representative should not, however, be a data protection officer; the draft European Data Protection Board (EDPB) guidance suggests that the roles are incompatible and combining them would be a conflict of interest.

Appointing a European Representative

You will need to authorise the representative, in writing, to act on your behalf regarding your EU GDPR compliance, and to deal with any supervisory authorities or data subjects in this respect.

In practice you should appoint a representative through a service contract.

The appointment of a representative must be in writing and should set out the terms of the relationship. Having a representative does not affect your own responsibility or liability under the EU GDPR.

Although the representative should be located in the Member State in which a significant proportion of your data subjects are located, the representative must remain easily accessible to data subjects located in all relevant Member States.

When the function of being a representative is assumed by a company or any other type of organisation, a single individual should be assigned as a lead contact and person in charge for each controller or processor represented.

The role of the European Representative

The European Representative must:

  • Perform its tasks according to the written agreement.
  • Facilitate communication between data subjects and the controller or processor.
  • Maintain a record of processing activities under the responsibility of the controller or processor.

Notification of the appointment

You should provide EEA-based individuals, whose personal data you are processing, with the contact details of your representative. This may be done by including the details in your privacy notice or in upfront information provided to individuals when you collect their data. You must also make the information easily accessible to supervisory authorities – for example by publishing it on your website.

Liability of European Representatives

In November 2018 the EDPB issued draft guidance that said that supervisory authorities were able to initiate enforcement action (including fines) against a European Representative in the same way as they could against the controller or processor which appointed them:

“To this end, it was the intention to enable enforcers to initiate enforcement action against a representative in the same way as against controllers or processors. This includes the possibility to impose administrative fines and penalties, and to hold representatives liable.”

Given that fines under GDPR can hit €20 million or 4% of global annual turnover (whichever is higher), the EDPB guidance sent shockwaves through the industry, with many prospective representatives deciding it wasn’t such a good idea to be a representative after all.
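As a rough illustration of how that cap works in practice (a sketch, not legal advice – the function name is ours, and real fines are set case by case well below the maximum):

```python
def max_gdpr_fine(annual_turnover_eur: float) -> float:
    """Upper limit of a GDPR administrative fine under Article 83(5):
    the HIGHER of EUR 20 million or 4% of total worldwide annual turnover."""
    return max(20_000_000, 0.04 * annual_turnover_eur)

# A company turning over EUR 100m: 4% is only EUR 4m, so the EUR 20m figure applies.
print(max_gdpr_fine(100_000_000))    # 20000000

# A company turning over EUR 2bn: 4% is EUR 80m, which exceeds EUR 20m.
print(max_gdpr_fine(2_000_000_000))  # 80000000.0
```

The point the calculation makes clear is that the €20 million figure is a floor on the maximum, not a ceiling: for any business with worldwide turnover above €500 million, the 4% limb governs.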

However, in an about-turn in November 2019, the EDPB issued draft guidance which says the intention was:

“To this end, it was the intention to enable supervisory authorities to initiate enforcement proceedings through the representative designated by the controllers or processors not established in the Union. This includes the possibility for supervisory authorities to address corrective measures or administrative fines and penalties imposed on the controller or processor not established in the Union to the representative… The possibility to hold a representative directly liable is however limited to its direct obligations referred to in articles 30 and 58(1)(a) of the GDPR.”

Articles 30 and 58(1)(a) simply concern keeping a record of processing activities and providing information to supervisory authorities when ordered to do so.

Summary

Right now, you can ignore those marketing emails about appointing a European Representative but 31 December 2020 will come around soon enough. If you have customers in the EEA but no office, branch or other establishment in the EEA then, as things currently stand, you should be appointing a European Representative before the year ends.

If you have any questions on appointing a European Representative or on data protection generally contact one of our data protection lawyers.



Cloud Services Legal Issues

Cloud services are on the rise – they are highly relevant now and they are the future. In this article we provide a brief overview of some of the legal and commercial issues to consider when using cloud services and dealing with cloud services contracts.

What are cloud services?

Cloud services describe the delivery of technology services via the internet. Cloud users either do not need to purchase or install software at all or, if they do, only on a small scale using standardised software. Cloud users do not have to run their own applications or provide the computing power from their own data centres; they benefit from massive economies of scale, dramatically lowering the cost of IT service provision.

Cloud services on the rise

The UK has seen rapid adoption of cloud computing in business, with Software as a Service (SaaS) the preferred deployment model. Cutting costs and providing mobile working solutions for staff are the main impetus for such innovation. The flexibility and scalability of cloud computing mean organisations are happy to trade off some of the control that exists in traditional services.

The rapid take up of cloud services is not limited to the private sector. The fourth iteration of the pan-government G-Cloud Framework has just been awarded to a wide array of large and small cloud operators.

The nature of cloud service provision means that a number of well-established IT concepts need to be reconsidered and will continue to need consideration as technology is refined. Furthermore, cloud services are increasingly regulated through a wide variety of legislative provisions that do not specifically relate to cloud service provision but nonetheless have a considerable impact on it.

How cloud service providers operate

Cloud service arrangements are generally paid for on a service basis, which means that the upfront charges and regular upgrade fees associated with more traditional software licensing are avoided.

Some cloud service providers may seek to levy start-up fees or upfront subscription charges to mitigate their own commercial exposure, for example, for any third-party software licensing charges. The most common approach now is a committed term of 1 to 3 years when signing up to an enterprise SaaS service – as suppliers want to be able to recognise revenue in their accounts.

Intellectual property issues

Licensing:

Although cloud services contracts relate to the provision of services rather than to the supply of software to customers, appropriate software licences still need to be granted to the customer, particularly in SaaS arrangements. Without a licence, users’ online use of the software would amount to copyright infringement. The licences are usually very narrowly defined, limited to use of the online application for the customer’s own business purposes. Customers have no right to make copies of, or modifications or enhancements to, the software, and they cannot sub-license it to third parties.

The cloud services provider will not always own the intellectual property rights in the software that is the subject of the cloud service provision. Where this is the case the cloud services provider will need to arrange for the right to sub-license the software to its customers, or for a direct licence to be entered into between the customers and the relevant third-party licensors. For contractual simplicity, it is preferable (and most common) for the cloud service provider to sub-license the customer’s use of the third-party software.

Content and Data licensing:

The extent to which cloud services providers can make use of the data that is stored within their systems by their customers has become an important issue as a result of the significant marketplace developments in data analytics, including the use of artificial intelligence. Until data analytics became a mainstream business activity, cloud providers tended to regard their customers’ data storage requirements as being a necessary business overhead as part of the overall cloud arrangement. With data analytics, customer data has become a valuable resource which can be used to provide the basis for value added data analytics derived services.

In the early days of cloud services provision, many standard terms and conditions offered by cloud service providers in the consumer market included a broad licence from the customer to the service provider allowing the provider to use any content stored on its servers. These licences were often expressed as perpetual and irrevocable. The uses the service provider could make of the content were usually limited, but there were often rights to pass the content to third parties and to use it for marketing purposes. Even in the consumer marketplace, there is now considerably more general awareness of data issues, particularly following the Facebook/Cambridge Analytica scandal. In July 2019, the US Federal Trade Commission voted to approve fining Facebook around $5 billion to finally settle its investigation of these issues.

As a result, customers receiving cloud services should carefully consider the licensing provisions that govern the supplier’s use of the data it stores in the course of providing the services, particularly in relation to the use of personal data, the treatment of intellectual property rights and confidentiality. Customers should take particular care in identifying any rights they are agreeing to grant to the service provider. Licences may be implied by necessity or business efficacy; however, a better and more certain approach is to have an express licence in place that is broad in scope and covers the full range of likely activities.

Jurisdiction and governing law

It is common for cloud services providers and their customers to be located in different jurisdictions. Where this is the case, two separate issues need to be considered: applicable law and jurisdiction. In each case, the cloud contract may stipulate choice of law and jurisdiction. However, there may also be separate and different rules on applicable law and jurisdiction that apply irrespective of provisions in the contract: data protection is a good example of this, where the GDPR has its own free standing rules.

Which law governs the contract

Usually the contract will state the laws that apply. If it doesn’t, determining the governing law can be problematic, especially when cloud services are involved. If, for example, the parties to the contract are based within the EU then, in a B2B context, it will generally be the laws of the place where the cloud services provider bases its servers that apply. The position is more complex where service data is stored on multiple servers in different jurisdictions.

It is important therefore to ensure that cloud services contracts include a choice of law (and jurisdiction) clause.

Data Protection

When organisations process personal data they do so either as a “data controller” or a “data processor”. Each has different legal obligations when protecting personal data.

The data controller is the organisation that determines the purposes and means of the processing of personal data and is responsible for compliance with data protection law. In cloud services, the UK’s data protection regulator, the ICO, usually views the customer as the data controller, although when the supplier has a large amount of control over the processing of personal data they may be considered a joint data controller.

The data processor is the entity who processes data on behalf of a data controller. The ICO will regard the cloud services provider as a data processor in most cloud services arrangements.

Most obligations under data protection law fall on the data controller and therefore, usually, on the customer of a cloud services provider. A customer should therefore only allow a cloud services provider to process data on its behalf if the provider has appropriate organisational and technical measures in place. Special care must also be taken if international data transfers take place in connection with the processing of the customer’s data.

Checklist for cloud services contracts (buyer perspective)

Before signing on the dotted line you should consider:

  • Data storage: where will your data be stored, how is it stored, who has access to it and what security measures are in place?
  • Warranties and indemnities: what disclaimers are contained in the agreement, and have appropriate indemnities been given for loss of data?
  • Check for hidden costs: monthly service costs may be low for a reason.
  • How will disputes be dealt with: what law applies and where will disputes be heard?
  • Data recovery: what will happen to your data at the end of the contract?

Checklist for cloud services contracts (supplier perspective)

Make sure that you have considered the following:

  • Intellectual Property Rights: although supplying software as a service is more protective of IPRs you should still make sure that your IP rights are covered.
  • Limitations and exclusions of liability: it’s standard practice to exclude liability for certain losses and to have an overall cap on liability.
  • Will you provide support commitments / service availability guarantees? Your business customers may well insist on these.
  • If you offer a subscription per person what happens if unauthorised individuals access the service? Consider including audit rights.
  • What should happen with the customer’s data at the end of the contract – you probably want the right to delete it after a certain time.
  • Choice of law and jurisdiction.

Cloud services – a multifaceted and evolving area of law

Contracts for the provision of cloud services and the legal issues thrown up by the uptake in cloud services technology are evolving all the time. If you need help with cloud services contracts or any technology legal issues then please get in touch with us.



COVID-19 Data Protection Issues

COVID-19 data protection issues have left many businesses scrambling to keep on top of their compliance functions. Other businesses are largely ignoring data protection rules – which are you?!

Although not always front of mind in a crisis, data protection laws are there to be followed. COVID-19 is putting data protection rules to the test as new information about individuals is collected in response to the pandemic. This often includes whether individual members of staff are displaying symptoms of the virus, the health status of staff and related individuals within the same household, the results of COVID-19 testing and the various locations individuals have visited since the start of the outbreak.

This new information constitutes “personal data” and sometimes falls within “special categories of personal data”, as provided for under Article 9 of the General Data Protection Regulation (EU) 2016/679 (GDPR) and applicable data protection laws.

Regulators’ Response

Data protection regulators across the EU have issued statements and guidance referring to the effect of COVID-19 on data protection.

The European Data Protection Board (EDPB) has stated that data protection laws in the EU do not, and should not, hinder the response to COVID-19. Therefore organisations subject to such regulation should remain compliant with their obligations under GDPR. The EDPB has commented that the COVID-19 emergency is a “legal condition which may legitimise restrictions of freedoms provided these restrictions are proportionate and limited to the emergency period”. Whether this means governments have the right to police data protection compliance more or less strictly is unclear.

In the UK the Information Commissioner’s Office (ICO) has published guidance in the context of COVID-19 data protection. The ICO’s approach is sympathetic to the challenges faced by organisations:

“We understand that resources, whether they are finance or people, might be diverted away from usual compliance or information governance work. We won’t penalise organisations that we know need to prioritise other areas or adapt their usual approach during this extraordinary period”.

The ICO then goes on to mention that this does not extend as far as allowing infringement of statutory timescales but that they will endeavour to communicate to individuals bringing information rights requests that understandable delays may ensue.

The guidance should not be interpreted as a blank cheque by organisations to bend the rules relating to data protection compliance. It is only guidance and may not stand up in court. Additionally, the ICO does not grant any express relaxation of the rules. It has also stated, in line with the EDPB, that data protection should not stop organisations from being able to respond effectively to the crisis.

“Personal Data” and/or “Special Categories of Personal Data”

Information such as whether personnel have self-isolated, the body temperature of personnel, records of visitors to premises and device location data will all be considered personal data. Where the information also relates to an individual’s health, it will also fall within the sub-category of “special categories of personal data” – more on this below.

Legal Basis for Processing Personal Data

When processing COVID-19 personal data (that isn’t “special category data”) organisations may rely on the following legal bases:

Legitimate interests: for the purpose of the organisation’s legitimate interests in managing business continuity and the well-being of its staff.

Contractual necessity: necessary for an organisation’s performance of its obligations to its staff e.g. employees under their employment contract. Relevant obligations include ensuring the health, safety and well-being of employees.

Legal obligation: organisations have legal obligations relating to health and safety.

Legal Basis for Processing Special Categories of Personal Data

It is likely that when responding to the COVID-19 crisis organisations will collect special category data. This is because special category data, within the context of health, is defined as:

“personal data related to the physical or mental health of a natural person, including the provision of health care services which reveal information about his or her health status”.

This includes information on injury, disease, diagnosis, medical history, medical examination data, registration details with a health service, appointment details and/or a number, symbol or other identifier assigned to an individual to uniquely identify them for health purposes.

Organisations can only process special category data on one or more of the following grounds:

Employment, social security and social protection obligations: certain obligations under employment, social security and social protection law may allow the processing of special category data. You need to be able to identify the legal obligation or right in question, either by reference to the specific legal provision or by pointing to an appropriate source of advice or guidance that sets it out clearly. You can refer to a government website or to industry guidance that explains generally applicable employment obligations or rights. In this instance it would be sufficient to refer to the Health and Safety at Work etc. Act 1974 (UK), which states:

“it shall be the duty of every employer to ensure, so far as is reasonably practicable, the health, safety and welfare at work of all his/her employees”.

For example, an employer will want to know whether, in light of COVID-19, an individual member of staff is a health risk in order to ensure the health, safety and welfare of that staff member and the other employees. This is likely to include collecting special category health data from a number of individuals. The employer can rely on employment, social security and social protection obligations to do this processing.

On the other hand, if the employer were to collect unnecessary data such as medical information beyond the scope of that required to diagnose COVID-19 within government guidance, or if the employer disclosed the names of people diagnosed when it was unnecessary to disclose such information then these actions would amount to infringements of data protection law.

Preventative or occupational medicine: occupational medicine is a specialist branch of medicine that focuses on the physical and mental wellbeing of employees in the workplace. Under GDPR the processing of special category data is permitted for the purposes of preventative or occupational medicine, the assessment of an employee’s working capacity, medical diagnosis and/or the provision of health care or treatment.

Section 11 of the Data Protection Act 2018 (UK) states that in the UK organisations can only rely on this condition if the information is being processed by a health professional, a social work professional or another person who in the circumstances owes a duty of confidentiality under an enactment or rule of law. Therefore, this condition only applies where an organisation has appointed professional medical or social work advisers.

So, an organisation can be justified in processing special category data relating to COVID-19 on the advice of its medical advisors but only when able to show that the processing of this specific data is necessary. It must be a reasonable and proportionate way of achieving one of these purposes, and the organisation must not collect more data than it needs.

Public interest in the area of public health: on the advice of public medical advisors it may be possible to process special category data. This condition is only applicable where the processing is by, or under the responsibility of, a health professional or by someone else who in the circumstances owes a legal duty of confidentiality. For example, an organisation is contacted by health professionals who are trying to collect special category data in relation to the COVID-19 crisis to enable statistical analysis of the disease. On the advice of such public medical advisors, an organisation may rely upon the public interest in the area of public health condition when processing special category data for this purpose.

Consent is another legal basis for processing personal data. However, when an organisation collects data about individuals it is better not to rely upon consent because of the risk that it is not freely given. This is based upon the general view that an inherent imbalance of power exists between individuals and organisations, in favour of organisations. Consent can also be withdrawn at any time.

Proportionate Collection/Processing of Personal Data for Purpose

An important aspect of GDPR compliance is that organisations only collect as much personal data as is strictly necessary for the purposes being pursued.

Within the context of COVID-19 this includes not naming an individual who is a health risk, and not sharing other sensitive information about that individual within an organisation, when it is not strictly necessary. Another example is enquiring about those experiencing symptoms within an individual’s household: in this instance it is unlikely that any more information than a simple ‘yes’ or ‘no’ answer would be required.

In addition, organisations should ensure that the personal data that they collect is stored only for as long as necessary.

COVID-19 Data Protection Q&A

Can you tell staff that a colleague may have potentially contracted COVID-19?

Yes. You should keep staff informed about cases in your organisation. But don’t provide any more information than necessary. You have an obligation to ensure the health and safety of your employees, as well as a duty of care. Data protection rules do not prevent you doing this.

Can you collect health data in relation to COVID-19 about employees or from visitors to my organisation? What about health information ahead of a conference, or an event?

You have an obligation to protect your employees’ health and therefore it is reasonable to ask people, be they employees or visitors to your organisation, to tell you if they are experiencing COVID-19 symptoms, and hence to collect special category data about them. Don’t collect more than you need and ensure that any information collected is treated with appropriate safeguards and discarded as soon as it becomes obsolete.

For example, the best approach would be a simple yes or no question as to whether an employee or visitor is experiencing COVID-19 symptoms, or whether anybody in their household is. Collecting any medical information unrelated to COVID-19 or to their ability to visit your organisation would be deemed unnecessary.

You could also ask visitors to consider government advice before they decide to come. And you could advise staff to call 111 if they are experiencing symptoms. This approach should help you to minimise the information you need to collect.

Homeworking

Data protection is not a barrier to increased and different types of homeworking. During the pandemic, staff may work from home more frequently than usual and they can use their own device or communications equipment. Data protection law doesn’t prevent that, but you’ll need to consider the same kinds of security measures for homeworking that you’d use in normal circumstances. This includes the potential need to specifically train homeworkers on their obligations and those of the employer in relation to data protection and confidentiality, concerning the procedures which they must follow, and what is, and is not, an authorised use of data.

Should Organisations Consider Undertaking a Data Protection Impact Assessment (DPIA)?

GDPR requires organisations to undertake a mandatory DPIA:

  • if their processing is likely to result in a high risk to the rights and freedoms of individuals – this should involve consideration of the likelihood and severity of the potential harm. Article 35(3) of the GDPR provides the following examples of when a processing operation is "likely to result in high risks":
      • a systematic and extensive evaluation of personal aspects relating to natural persons which is based on automated processing, including profiling, and on which decisions are based that produce legal effects concerning the natural person or similarly significantly affect the natural person;
      • processing on a large scale of special category data, or of personal data relating to criminal convictions and offences;
      • a systematic monitoring of a publicly accessible area on a large scale.
  • (most relevant to COVID-19) when processing biometric data, genetic data and/or tracking data:
      • the GDPR defines biometric data in Article 4(14) as “personal data resulting from specific technical processing relating to the physical, physiological or behavioural characteristics of a natural person, which allow or confirm the unique identification of a person, such as facial images or dactyloscopic data”. A fingerprint would be an example.
      • the GDPR defines genetic data in Article 4(13) as “personal data relating to the inherited or acquired genetic characteristics of a natural person”. A genetic profile of an individual would be an example.
      • tracking data – an example would be an organisation using device location data when assessing the geographical spread of COVID-19.

If an organisation has already started to undertake such processing activities, or to process this kind of data, without undertaking a DPIA, it should perform one as soon as possible.

In the context of COVID-19 a DPIA will be necessary if an organisation has processed data in this way, or data of this nature, in response to the pandemic. It is also helpful to know when a DPIA would be expected, so that an organisation can structure its processing to avoid triggering one. Another example might be an organisation that becomes involved in the large-scale processing of data in response to the crisis. Such an organisation should be prepared to undertake a DPIA if the nature of this new processing requires it.

Undertaking a DPIA, whether mandatory or not, can still be useful for organisations in order to understand the potential risks within their data controlling and processing activities.

If you need any help with COVID-19 data protection issues or on any other aspects of data protection law please get in touch with one of our data protection lawyers.